Spectral Anticipations
This article deals with relations between randomness and structure in audio and musical sounds. Randomness, in the casual sense, refers to something that has an element of variation or surprise in it, whereas structure refers to something more predictable, rule-based, or even deterministic. When dealing with noise, which is the "purest" type of randomness, one usually adopts the canonical physical or engineering definition of noise as a signal with a white spectrum (i.e., composed of equal or almost-equal energies at all frequencies). This seems to imply that noise is a complex phenomenon simply because it contains many frequency components. (Mathematically, to qualify as a random or stochastic process, the density of the frequency components must be such that the signal has a continuous spectrum, whereas periodic components appear as spectral lines or delta functions.)
This reasoning is contradicted by the fact that, to our perception, noise is a rather simple signal, and in terms of its musical use, it does not allow much structural manipulation or organization. Musical notes and other repeating or periodic acoustic components in music are closer to being deterministic and could be considered "structure." However, complex musical signals, such as polyphonic or orchestral music containing simultaneous contributions from multiple instrumental sources, often have a spectrum so dense that it approaches a noise-like spectrum. In such situations, the structure of the signal cannot be determined from the spectrum alone. Therefore, the physical definition of noise as a signal with a smooth or approximately continuous spectrum obscures other significant properties that distinguish signals from noise, such as whether a given signal has temporal structure, that is, whether the signal can be predicted.
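The point that a flat spectrum does not imply unpredictability can be illustrated with a small sketch (not from the article; the nearest-neighbor predictor is our own illustrative choice). White noise and the fully chaotic logistic map both have flat, noise-like spectra, yet a simple predictor built from the signal's own past succeeds only on the deterministic one:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4000

# Two signals with broadly flat (noise-like) spectra:
# 1) white noise -- random and unpredictable
noise = rng.uniform(0, 1, N)
# 2) the logistic map at r = 4 -- deterministic chaos, also spectrally flat
x = np.empty(N)
x[0] = 0.3
for n in range(N - 1):
    x[n + 1] = 4.0 * x[n] * (1.0 - x[n])

def nn_prediction_error(s, train=3000):
    """Predict s[n+1] as the successor of the nearest past sample."""
    past, errs = s[:train], []
    for n in range(train, len(s) - 1):
        j = np.argmin(np.abs(past[:-1] - s[n]))   # nearest neighbor in the past
        errs.append((past[j + 1] - s[n + 1]) ** 2)
    return np.mean(errs)

ratios = {}
for name, s in [("white noise", noise), ("logistic map", x)]:
    ratios[name] = nn_prediction_error(s) / np.var(s)  # near/above 1 = unpredictable
    print(f"{name}: prediction error / variance = {ratios[name]:.4f}")
```

For the white noise the prediction error is on the order of the signal variance (prediction does not help at all), whereas for the chaotic signal it is orders of magnitude smaller: the spectra look alike, but only one signal has temporal structure.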
The article presents a novel approach to the (automatic) analysis of music based on an "anticipation profile." This approach considers the dynamic properties of signals in terms of an anticipation property, which is shown to be significant for discriminating noise from structure and for the characterization and analysis of complex signals. Mathematically, this is formalized as the reduction in the uncertainty about the signal that is achieved when the listener forms anticipations.
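Under standard information-theoretic definitions, this reduction in uncertainty can be written as the difference between the marginal entropy of the present sample and its entropy conditioned on the observed past (the notation here is ours, a plausible formalization rather than a quotation of the article's):

```latex
% Uncertainty reduction achieved by anticipation: entropy of the present
% sample minus its entropy given the past that the listener has observed.
\mathrm{IR}(x_n) \;=\; H(x_n) \;-\; H\!\left(x_n \mid x_1, x_2, \ldots, x_{n-1}\right)
```

A value near zero means anticipation does not help (pure noise, or a nearly constant signal with little uncertainty to begin with); larger values indicate temporal structure.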
Considering anticipation as a characteristic of a signal involves a different approach to the analysis of audio and musical signals. In our approach, the signal is no longer characterized by its features or descriptors alone; its characterization also takes into account an observer that operates intelligently on the signal, so that both the information source (the signal) and a listener (information sink) are included as parts of one model. This approach fits very well into an information-theoretic framework, where it becomes a characterization of a communication process over a time-channel between the music (the present time of the acoustic signal) and a listener (a system that has memory and prediction capabilities based on the signal's past). The amount of structure is equated with the amount of information that is "transmitted" or "transferred" from the signal's past into its present, which depends both on the nature of the signal and on the nature of the listening system.
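As a concrete illustration (our own hypothetical example, not taken from the article), this "transfer" from past to present can be estimated for a Gaussian AR(1) signal, where the listener is a best linear one-step predictor and the theoretical value is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000

def info_rate_ar1(a):
    """Estimate H(x_n) - H(x_n | past) in bits for a Gaussian AR(1) signal.

    The 'listener' is a best linear one-step predictor; the signal's
    structure is the entropy reduction its predictions achieve.
    """
    e = rng.normal(size=N)            # innovation: the truly new information
    x = np.empty(N)
    x[0] = e[0]
    for n in range(1, N):
        x[n] = a * x[n - 1] + e[n]
    # Fit the one-step predictor from the signal's own past
    coef = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - coef * x[:-1]     # what prediction could not remove
    # For Gaussians, entropies differ by half the log of the variance ratio
    return 0.5 * np.log2(np.var(x) / np.var(resid))

rates = {a: info_rate_ar1(a) for a in (0.0, 0.5, 0.95)}
for a, r in rates.items():
    print(f"a = {a:.2f}: transferred information ≈ {r:.3f} bits/sample")
```

For a = 0 the signal is white noise and essentially nothing is transferred from past to present; as a grows, prediction removes more uncertainty, matching the theoretical rate of 0.5·log₂(1/(1 − a²)) bits per sample.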
This formulation introduces several important advantages. First, it resolves certain paradoxes related to the use of information-theoretic concepts in music. Specifically, it corrects the naïve equation of entropy (or uncertainty) with the amount of "interest" present in the signal. Instead, we consider the relative reduction in uncertainty caused by prediction/anticipation. Additionally, the measure has the desired "inverted-U function" behavior: both nearly constant signals and highly random signals are characterized as having little structure. In both cases, the reduction in uncertainty owing to prediction is small. In the first case, this is because there is little variation in the signal to start with; in the second case, there is little reduction in uncertainty because prediction has little or no effect. Finally, this formulation also clarifies the difference between determinism and predictability. Signals that have deterministic dynamics (such as certain chaotic signals) might have varying degrees of predictability, depending on the...