Toward Robotic Musicianship

Gil Weinberg and Scott Driscoll

We present the development of a robotic percussionist named Haile that is designed to demonstrate musicianship. We define robotic musicianship in this context as a combination of musical, perceptual, and interaction skills with the capacity to produce rich acoustic responses in a physical and visual manner. Haile listens to live human players, analyzes perceptual aspects of their playing in real time, and uses the product of this analysis to play along in a collaborative and improvisatory manner. It is designed to combine the benefits of computational power, perceptual modeling, and algorithmic music with the richness, visual interactivity, and expression of acoustic playing. We believe that combining machine listening, improvisational algorithms, and mechanical operations with human creativity and expression can lead to novel musical experiences and outcomes. Haile can therefore serve as a test bed for novel forms of musical human–machine interaction, bringing perceptual aspects of computer music into the physical world both visually and acoustically.

This article presents our goals for the project and the approaches we took in design, mechanics, perception, and interaction to address these goals. After an overview of related work in musical robotics, machine musicianship, and music perception, we describe Haile's design; the development of two robotic arms that can strike different locations on a drum with controllable volume levels; applications developed for low- and high-level perceptual listening and improvisation; and two interactive compositions for humans and a robotic percussionist that use Haile's capabilities. We conclude with a description of a user study that was conducted to evaluate Haile's perceptual, mechanical, and interaction functionalities. The results of the study showed significant correlation between humans' and Haile's rhythmic perception, as well as strong user satisfaction with Haile's perceptual and mechanical capabilities. The study also indicated areas for improvement, such as the need for better timbre and loudness control and for more advanced and responsive interaction schemes.

Goals and Motivation

Most computer-supported interactive music systems are hampered by their inanimate nature, which deprives players and audiences of the physical and visual cues that are essential for creating expressive musical interactions. In acoustic playing, for example, the size of a motion often corresponds to loudness, and the location of a gesture often relates to pitch. These cues provide visual feedback and help players anticipate and coordinate their playing. They also create a more engaging experience for the audience by providing a visual connection to the sound. Computer-supported interactive music systems are also limited by the electronic reproduction and amplification of sound through speakers, which cannot fully capture the richness of acoustic sound.

Our approach for addressing these limitations is to employ a mechanical apparatus that converts digital musical instructions into physically generated acoustic sound. We believe that musical robots can bring together the unique capabilities of computational power and the expression and richness of acoustic sounds created through physical and visual interaction. A musical robot can combine algorithmic analysis and response capabilities that are not humanly possible with rich sound and visual gestures that cannot be reproduced by loudspeakers. We hope that such novel human–machine interaction can lead to new musical experiences and new music that cannot be conceived by traditional means. The engaging power of musical robots can also be useful as an educational tool, introducing learners not only to music but also to the mathematics, physics, and technology behind it in an interactive, hands-on manner.
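As an illustration of this approach, the following minimal sketch shows one way a digital musical instruction could be converted into commands for a single robotic percussion arm. This is not Haile's actual control code: the NoteEvent and StrikeCommand types, the function name, and the parameter ranges are hypothetical assumptions introduced here for clarity.

```python
# Illustrative sketch (not Haile's implementation): mapping a digital note
# event onto physical actuation parameters for one robotic percussion arm.
# All names and ranges are hypothetical.

from dataclasses import dataclass


@dataclass
class NoteEvent:
    """A digital musical instruction, e.g. derived from a MIDI note."""
    onset_s: float    # when to strike, in seconds
    velocity: int     # loudness, 0-127 (MIDI convention)
    position: float   # normalized position on the drumhead, 0 (center) to 1 (rim)


@dataclass
class StrikeCommand:
    """A low-level command for the robotic arm."""
    time_s: float          # scheduled strike time
    arm_angle_deg: float   # where on the drumhead the arm should aim
    strike_force_n: float  # impact force chosen to realize the requested loudness


def note_to_strike(note: NoteEvent,
                   max_angle_deg: float = 30.0,
                   max_force_n: float = 20.0) -> StrikeCommand:
    """Translate a note event into physical actuation parameters.

    Larger velocities become larger strike forces (and therefore louder
    sounds and larger visible motions); the normalized drumhead position
    becomes an arm angle, so timbre varies with strike location.
    """
    force = (note.velocity / 127.0) * max_force_n
    angle = note.position * max_angle_deg
    return StrikeCommand(time_s=note.onset_s,
                         arm_angle_deg=angle,
                         strike_force_n=force)


if __name__ == "__main__":
    cmd = note_to_strike(NoteEvent(onset_s=0.5, velocity=96, position=0.8))
    print(cmd)
```

In a mapping of this kind, louder notes produce larger strike forces, and hence larger visible motions, echoing the correspondence between motion size and loudness noted above.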

Current research directions in musical robotics focus mostly on sound production and rarely address perceptual aspects of musicianship, such as listening, analysis, improvisation, or group interaction. Such automated devices generally fall into one of two categories: robotic musical instruments, mechanical constructions that can be played by live musicians or triggered by pre-recorded sequences (Singer, Larke, and Bianciardi 2003; Jordà 2002; Dannenberg et al. 2005), and anthropomorphic musical robots, humanoid robots that attempt to imitate the actions of human musicians (Takanishi and Maeda 1998; Sony 2003; Toyota 2004). Only a few attempts have been made to develop perceptual robots that are controlled by neural networks or other autonomous methods (Baginsky 2004).

The work described in this article addresses our preliminary...
