In lieu of an abstract, here is a brief excerpt of the content:

  • klipp av:Live Algorithmic Splicing and Audiovisual Event Capture
  • Nick Collins and Fredrik Olofsson

Recent new media concerts show a trend toward fuller integration of modalities through close audiovisual collaboration, avoiding the sometimes artificial separation of disc jockey (DJ) and video jockey (VJ), of audio and visual artist. Integrating the audio and visual domains has been an artistic concern from the experimental films of such notables as Oskar Fischinger and Norman McLaren earlier in the 20th century, through 1960s happenings, 1970s analog video synthesizers, and 1980s pop videos, to the current proliferation of VJing, DVD labels, and live cinema (Lew 2004). The rise of the VJ has been allied with the growth of club culture since the 1980s, with the Super-8 film and video projectionists of early raves now replaced by "laptopists" armed with commercial digital VJ software such as Isadora, Aestesis, Motion Dive, and Arkaos VJ. (An extensive list is maintained at www.audiovisualizers.com.)

In much current practice, where a VJ accompanies fixed (pre-recorded) audio, correlation in mapping is usually achieved via a simple spectral analysis of the overall output sound: graphical objects are controlled by a downsampled energy envelope within a frequency band. Yet this is a crude solution for live-generated audio; the fine details of audio-object creation should themselves be accessible as video controls. Analogous to the sampling culture within digital music, source material for visual manipulation is often provided by pre-prepared footage or captured live with digital cameras. Synthesis also provides an option for the creation of imagery, and generative graphics are a further staple. Modern programs integrate many different possible sources and effects processes in software interfaces, with external control from MIDI, Open Sound Control (OSC; Wright and Freed 1997), and Universal Serial Bus (USB) devices.
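The envelope-following approach described above can be sketched in a few lines. This is an illustrative reconstruction, not code from any particular VJ package: a mono signal is rectified, smoothed with a one-pole lowpass (a leaky integrator), and downsampled to a control rate suitable for driving a graphical parameter such as brightness or scale. The function name, smoothing coefficient, and hop size are all assumptions.

```python
def energy_envelope(samples, smoothing=0.99, hop=64):
    """Return one control value per `hop` input samples."""
    env = 0.0
    controls = []
    for i, x in enumerate(samples):
        # One-pole lowpass over the rectified signal tracks short-term energy.
        env = smoothing * env + (1.0 - smoothing) * abs(x)
        if (i + 1) % hop == 0:
            # Downsample: emit the envelope only once per hop, at control rate.
            controls.append(env)
    return controls

# A burst of full-scale samples followed by silence: the envelope rises,
# then decays, and could scale, say, the brightness of a projected shape.
signal = [1.0] * 256 + [0.0] * 256
env = energy_envelope(signal)
```

Applying the same follower after a bandpass filter yields the per-band version mentioned above; its crudeness is exactly that it sees only the summed output, not the individual audio objects behind it.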

Live performance has seen the development of MIDI-triggering software for video clips, such as EBN's Video Control System and Coldcut's VJamm (www.vjamm.com), and of turntable tracking devices applied as control interfaces to video playback, such as Final Scratch (www.finalscratch.com) and MsPinky (www.mspinky.com). The influential audiovisual sampling group Coldcut performs live by running precomposed or keyboard-performed MIDI sequences from Ableton Live as control inputs to their VJamm software, triggering simultaneous playback of video clips with their soundtracks. They have not explored, however, the use of captured audio and video, nor the real potential of algorithmic automation of such effects. This article describes techniques that take this natural step.
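The MIDI-triggering principle is simple enough to sketch. This is a hypothetical illustration of the general technique, not Coldcut's or VJamm's actual implementation: a raw MIDI note-on message (status byte `0x90` plus channel, then note and velocity) is decoded, and the note number indexes into a bank of video clips, so a keyboard performance triggers clip playback. The clip names and base note are invented for the example.

```python
CLIPS = ["drums.mov", "vocal.mov", "skyline.mov"]  # hypothetical clip bank
BASE_NOTE = 60  # middle C triggers clip 0 in this sketch

def clip_for_midi(msg):
    """Return the clip name triggered by a raw 3-byte MIDI message, or None."""
    status, note, velocity = msg
    # Channel voice note-on: high nibble 0x90; velocity 0 counts as note-off.
    is_note_on = (status & 0xF0) == 0x90 and velocity > 0
    if not is_note_on:
        return None
    index = note - BASE_NOTE
    if 0 <= index < len(CLIPS):
        return CLIPS[index]
    return None

# Note-on, channel 1, note 61, velocity 100 triggers the second clip.
print(clip_for_midi((0x90, 61, 100)))  # → vocal.mov
```

Because each clip carries its own soundtrack, one incoming note can start audio and video in sample-accurate sync, which is the core of the VJamm performance model described above.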

Customizable graphical programming languages like Max/MSP (with the nato.0+55, softVNS2, and Jitter extensions), Pure Data (Pd, with GEM, PDP, and GridFlow extensions), or jMax (with the DIPS, or Digital Image Processing with Sound, package of Matsuda et al. 2002) can cater to those who see no a priori separation of the modalities and wish to generate both concurrently, from a common algorithm or with some form of internal message passing. Other authors choose to define their own protocols for information transfer between modality-specific applications (Betts 2002; Collins and Olofsson 2003), perhaps using a network protocol like OSC to connect laptops.
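For the second approach, inter-application message passing over OSC, the wire format is fixed by the OSC specification even though each duo's vocabulary of addresses is their own. The sketch below encodes a minimal OSC message with float arguments using only the standard library; the address `/clip/rate` is a hypothetical example of the kind of control message an audio laptop might send to a graphics laptop.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    # Type-tag string: a comma followed by one 'f' per float argument.
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)  # arguments are big-endian float32
    return msg

packet = osc_message("/clip/rate", 1.5)
# The packet could then be sent as a UDP datagram to the other laptop, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))
```

Defining one's own address space on top of this format is precisely the "own protocols for information transfer" strategy mentioned above: the transport is standard, while the message vocabulary is specific to the collaboration.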

The heritage of the VJ is somewhat independent of another tradition that has combined music and image, namely film. It is important to be cautious about the direct application of film music theory, particularly wherever the subservience of non-diegetic music is trumpeted. The emotional underscoring of narrative concerns in typical orchestral film music is certainly not the state of play in a club environment! As Nicholas Cook (1998) notes, it may be more correct to speak of music film in many cases of multimedia, and the emphasis is certainly this way around in the history of VJ performance. In an artistic sphere, however, the marriage of sound and vision provides potential not just for direct "mickey mousing" (the absolute coincidence of sound and visual action, originally pertaining to animated characters), but also for more subtle synchronizations, contrasts of pacing, and even combative opposition of the domains.

The mapping between modalities can be a central concern of audiovisual collaboration, even as a theme of live improvisation.

klipp av (Swedish for "cut apart") is an audiovisual performance duo that has been...


Additional Information

  • ISSN: 1531-5169
  • Print ISSN: 0148-9267
  • Pages: pp. 8-18
  • Launched on MUSE: 2006-07-03
  • Open Access: No