Editor's Notes
Michael Gurevich, Guest Editor

Supplementary Material

• Podcast: Michael Gurevich on Human-Computer Interaction (mp3 9,250 KB)

Michael Gurevich, lecturer at the Sonic Arts Research Center at Queen's University, Belfast, serves as guest editor of the Winter 2010 issue of Computer Music Journal. In this podcast, Michael discusses the fields of Computer Music and Human-Computer Interaction (HCI). He describes how these fields intersect and what they can learn from each other, touching on how the field of Computer Music has grown and how this affects performance and composition of electronic music. This conversation was recorded on December 14, 2010.

Computer music has arrived. From the field's inception, we had to struggle with technology just to make our music happen. Out of necessity, our pioneers developed some of the earliest mixing consoles to diffuse sound in space, devised vacuum-tube digital-to-analog converters to hear the very first digital sound samples, created efficient synthesis and digital signal processing algorithms to manipulate sound in real time, designed low-latency hardware and software in order to coordinate multiple sound-making devices—the list goes on. But we have now won most of these battles, and have entered an age in which many of our fundamental challenges are no longer necessarily technological ones; rather, we are faced with the questions of how to effectively and creatively organize, assemble, and employ the dizzying array of technologies now at our disposal.

We can draw an interesting parallel to the domain of mainstream office computing. By the mid-1980s, the technologies upon which the desktop computing paradigm is based had largely been codified; indeed, the desktop office computer of today, with all of its peripherals, does not look much different than it did then. And it was around that time that the field of human-computer interaction (HCI) burgeoned in earnest. Questions of how best to wrap these technologies into novel products to yield the most efficient, reliable, and satisfying results came to the forefront. HCI gave us this new perspective, as well as sets of methodologies, frameworks, and practices with which to go about addressing these questions.

Being more technologically demanding, the field of computer music has taken a bit longer to reach a similar point in its history, but we have arrived. Of course, plenty of technological work remains to be done; however, many of the platforms we use have largely stabilized, and inexpensive, readily available software and hardware components are sufficiently fast and reliable to accomplish much of what we want to do. But unlike mainstream office computing, it is only in the past five to ten years that we have been able to say with some confidence that we largely have all the widgets and doodads we need. And it is precisely in that time that researchers and practitioners have begun to be concerned, in significant numbers, with issues of HCI that are specific to music creation. This new focus is evident in the dramatic rise of the Conference on New Interfaces for Musical Expression (NIME), which has just marked its tenth anniversary, and indeed in the diversity and quality of the articles in this special issue of Computer Music Journal, not to mention an overwhelming number of strong submissions that could not be included.

This issue begins with an interview conducted by Gascia Ouzounian with Paul DeMarinis, the keynote speaker at NIME 2009 at Carnegie Mellon University. Paul DeMarinis was himself an innovator in the technologies and techniques of interactive music and sound, but his work also often celebrates the early pioneers of science and technology—known, obscure, famous, and forgotten—situating their visions in fanciful and poignant contexts. His art twists technology as subject and object; it is as much about technology as it is of technology, and in the spirit of this issue it prompts us to reflect on our relationships with the fundamental forces and media on which much of our work relies.

NIME has been largely concerned with hardware interfaces and physical interaction, to the extent that it seems software is sometimes neglected. Two articles in this issue therefore came as pleasant surprises, as they investigate interactions with music software in two...
