Abstract
An interface is a field of abstractions where two systems interact with one another. We typically use this word for the locus where a human and a machine communicate. The interface can be as simple as an on/off button, or it can be multimodal: a mixture of different types of hardware used to input commands into a system that responds through screens, speakers, motors and haptic feedback. The computer is a meta-machine with no natural interface, unlike physical machinery where gears, buttons and wheels are natural extensions of the mechanism itself. This fact is problematized when the computer is used for music, as we have innumerable arbitrary ways of representing an interface to the audio system of the computer. It could be anything from a simple play button to a custom-written class that encapsulates the digital signal processing of an audio unit generator. The question here is that of purpose: what intentional bandwidth do we, as software designers, give to the users of our system? What degree of control do we provide, and which interfaces do we present as affordances of the system, such that the cognitive processes of the user can be reflected in the machine signal?
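The range the abstract describes, from a bare play button to a class that encapsulates a unit generator's signal processing, can be sketched in a few lines of SuperCollider. This is only an illustrative sketch: the class name SimpleTone and its play/stop interface are hypothetical and are not taken from the chapter.

```supercollider
// The "play button" end of the spectrum: one expression that starts a tone.
{ SinOsc.ar(440, 0, 0.2) }.play;

// The other end: a small class (hypothetical) that encapsulates the unit
// generator and exposes only frequency and amplitude as the user's
// intentional bandwidth. Class definitions must live in a .sc file on the
// class library path and be recompiled before use.
SimpleTone {
	var synth;

	play { arg freq = 440, amp = 0.2;
		// The function captures freq and amp and is compiled into a SynthDef.
		synth = { SinOsc.ar(freq, 0, amp) }.play;
	}

	stop {
		synth.free;
	}
}
```

With such a class loaded, the interaction reduces to `t = SimpleTone.new; t.play(330, 0.1); t.stop;`, which is the kind of deliberately narrowed affordance the abstract asks software designers to consider.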
| Original language | English |
| --- | --- |
| Title of host publication | The SuperCollider Book |
| Editors | Scott Wilson, David Cottle, Nick Collins |
| Place of Publication | Cambridge, MA |
| Publisher | MIT Press |
| Pages | 613-629 |
| Number of pages | 17 |
| ISBN (Print) | 0262232693 |
| Publication status | Published - 1 Jan 2011 |
Keywords
- Musical Interfaces
- Music
- Programming
- Open Source Software
- Composition
- Music Pedagogy
- Education