WAITE, Si (2015) Reimagining the Computer Keyboard as a Musical Interface. Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 168-169. ISSN 2220-4806.
File: ReimaginingTheComputerKeyboardAsAMusicalInterface.pdf (author's accepted version)
License: All Rights Reserved
Abstract
This paper discusses the use of typed text as a real-time input for interactive performance systems. A brief review of the literature covers text-based generative systems, links between typing and playing percussion instruments, and the use of typing gestures in contemporary performance practice. The paper then documents the author's audio-visual system, which is driven by the typing of text/lyrics in real time. It is argued that the system promotes the sensation of liveness through clear, perceptible links between the performer's gestures, the system's audio outputs and its visual outputs. The system also provides a novel approach to the use of generative techniques in the composition and live performance of songs. Future developments could include dynamic text effects linked to sound generation and greater interaction between the human performer and the visuals.
keywords: Text, typing, computer keyboard, live performance, Max, system
Item Type: Article
Additional Information: Kafka-Esque explores how the computer keyboard can be implemented in an interactive system for performing music with lyrics, replacing sung lyrics with visually projected typed text. Composing the piece was central to the research process (Candy and Edmonds, 2018) and involved a cyclical, iterative process of literature review, system-building/composing and reflection. The system builds on previous work in the New Interfaces for Musical Expression (NIME) community that explores the use of QWERTY keyboards for live performance (Fiebrink et al., 2007; Lee et al., 2016) and on other works that use typing gestures in live performance, such as Anderson's The Typewriter (1953) and Reich and Korot's The Cave (1994). Unlike these works, Kafka-Esque reveals connections between the act of singing and that of typing, while demonstrating how typing gestures can be captured and processed in several ways to create a multi-timbral audio-visual work. The practice also suggests techniques and strategies for implementation in popular music contexts, which are typically under-represented in work with interactive systems (Marchini et al., 2017). These findings are disseminated in the related NIME paper (Waite, 2015). Furthermore, live performances of Kafka-Esque demonstrate high levels of several aspects of liveness (Sanden, 2013). Findings have been shared with international academic and professional audiences at Innovations in Music 2017 (London), Tracking the Creative Process in Music 2017 (Huddersfield) and Loop 2017 (Berlin). The piece was the subject of a NIME 2015 paper and demonstration (Baton Rouge, USA) and was discussed in an Artist Statement in the Leonardo Music Journal (2014). Recordings of the piece and accompanying commentary have been published online, and the piece has been performed at Sonorities 2015 (Queen's University), MTI concerts (De Montfort University) and NoiseFloor (Staffordshire University). The software created for the piece is available for free download.
Uncontrolled Keywords: INCL
Faculty: Previous Faculty of Arts and Creative Technologies > Film, Sound and Vision
Event Title: NIME 2015
Event Location: Baton Rouge, USA
Event Dates: 30th May - 3rd June
Depositing User: Si WAITE
Date Deposited: 12 May 2015 10:57
Last Modified: 24 Feb 2023 13:42
URI: https://eprints.staffs.ac.uk/id/eprint/2089