About This Issue
This issue of Computer Music Journal presents recent work in a field of ongoing interest to our readers: the development and analysis of human interfaces for performing and interacting with computer music. Without exception, the authors of this issue's articles strive to reach broad conclusions of general utility, not just to detail the idiosyncrasies of a particular device.
David Wessel and Matt Wright discuss how to make the computer approach traditional musical instruments in expressivity and responsiveness. To achieve low latency (under 10 msec) and low variation of latency (under 1 msec), the authors represent gestural control signals as audio. They also emphasize the value of compelling metaphors for control, including various mappings of sound into two-dimensional space. The authors present other criteria; for example, a desirable interface welcomes novices with its ease of use without precluding the development of a virtuosic technique. The article describes specific technological developments toward these goals.
Sergi Jordà's article stresses the importance of designing the controller and the sound generation together. He argues that traditional instruments frequently benefit from a bidirectional interaction between the control mechanism and the sounding mechanism, and that digital instruments, which are typically limited to a unidirectional flow of information from controller to generator, can be enriched by incorporating this sort of complexity. Acoustic nonlinearity and haptic feedback are two examples of such complexity, but acoustic information can also in a sense be fed back visually to a graphical user interface, as in the author's software application, FMOL. Mr. Jordà describes the evolution and architecture of this software, which in the newest version permits online collaborative performance.
The next two articles focus on interfaces for the nonmusician. Ryan Ulyate and David Bianciardi describe the lessons they learned from installing a multi-sensor system in a dance club. The participants' body motions affected the music as well as lighting and projections. Similarly, Dominic Robson's article traces the evolution of some musical interfaces he developed that had playfulness and ease of use as paramount considerations. The work culminated in an installation where attendees could collaboratively strike pads and stretch rubber sheets to affect pitches, rhythms, reverberation, and granulation of sound.
The final article, by Marcelo Wanderley and Nicola Orio, studies methodologies for evaluating controllers. The authors review the relevant technical research in the field of human-computer interaction (HCI) and consider how to apply it to musical interfaces. Formalizing the evaluation of musical controllers, the authors believe, will aid both the designers of such devices and the musicians who need to understand and compare the devices' capabilities.
The articles in this issue are all based on work that was presented at the workshop "New Interfaces for Musical Expression" (NIME), held in Seattle in April 2001 as part of the annual conference of the Special Interest Group on Computer-Human Interaction (SIGCHI) of the Association for Computing Machinery (ACM). We appreciate the efforts of NIME's organizers, particularly Michael Lyons, who encouraged the authors to submit manuscripts to Computer Music Journal and who helped review the work. More information about the NIME workshop can be found in this issue's News section.
The reviews in this issue critique two festivals of electroacoustic music, a biography of Leon Theremin, and various recordings. British acousmatic composers are given especial consideration in the collection of compact discs reviewed here, while the multimedia discs feature two American composers of electroacoustic music.