
Simulated Aesthetics and Evolving Artworks: A Coevolutionary Approach
Gary R. Greenfield
Abstract

The application of artificial-life principles for artistic use has its origins in the early works of Sommerer and Mignonneau, Sims and Latham. Most of these works are based on simulated evolution and the determination of fitness according to aesthetics. Of particular interest is the use of evolving expressions, which were first introduced by Sims. The author documents refinements to the method of evolving expressions by Rooke, Ibrahim, Musgrave, Unemi, himself and others. He then considers the challenge of creating autonomously evolved artworks on the basis of simulated aesthetics. The author surveys what little is known about the topic of simulated aesthetics and proceeds to describe his new coevolutionary approach modeled after the interaction of hosts and parasites.

It would be prohibitive to attempt a comprehensive survey of all the artistic endeavors that have been influenced or inspired by artificial-life principles, both for reasons of space and because of the difficulty of documenting so many of the works that have been exhibited. Any such survey, however, should bring to light two important themes: (1) the incorporation of emergent behaviors into artistic works and (2) the exploration of simulated evolution for artistic purposes. The theme of emergent behavior forms the cornerstone for many interactive works, including the installations of Christa Sommerer and Laurent Mignonneau [1] and Rebecca Allen [2]. Such works may trace their origins to the MIT Media Lab ALIVE project [3]. Fueled by rapid advances in autonomous robotics, the growth of the gaming industry and the popularity of such toys as Tamagotchis and Furbies, so-called behavior engines and their emergent behaviors continue to make their presence felt in the world of fine art. On the other hand, the exploration of simulated evolution in the fine arts has not received such emphasis. I wish to survey its origins and development in greater detail.

Michael Tolson, co-founder of the digital-effects company Xaos Tools, won the prestigious 1993 Prix Ars Electronica award for his series of still images entitled Founder's Series. The series was computer generated with the aid of evolved neural nets. Since Tolson's software was proprietary, details of precisely how this was done are fragmentary. In print, Tolson described his method as applying the genetic algorithm to populations of neural nets in order to breed intelligent brushes [4]. This is slightly misleading. In fact, Tolson's neural nets were released onto background images, where they could sense and react to cues introduced by the artist. By responding to such cues, areas of the background image could be modified according to the brush procedures the neural nets had been bred to implement. As a SIGGRAPH panelist, Tolson showed videotape of the breeding stages of a population of neural nets that were trained to be photosensitive. The tape revealed that the neural nets were set in random motion on the surface of a background image. When patches of pure white were added to the image as cues, the photosensitive neural nets would streak toward these patches, dragging along underlying image colors [5]. Tolson's efforts seem not to have been duplicated, however [6].
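Because Tolson's software was proprietary, the behavior described above can only be illustrated speculatively. The sketch below, in Python, is not Tolson's method: it hand-codes a photosensitive "brush" agent that drifts at random over a cue layer, climbs toward brighter patches and drags underlying image colors along as it moves, whereas in the original that behavior was bred into neural nets. The grid size, sensing rule, attraction strength and smearing step are all assumptions made purely for illustration.

    import numpy as np

    # Speculative sketch (not Tolson's code): a photosensitive "brush" agent
    # drifts at random, climbs the brightness gradient of a cue layer and
    # drags the underlying image color along as it moves.
    rng = np.random.default_rng(0)
    H, W = 128, 128
    image = rng.random((H, W, 3))     # background image the brushes modify
    cue = np.zeros((H, W))            # artist-added cue layer
    cue[40:60, 80:100] = 1.0          # a patch of pure white acting as a cue

    def gradient(r, c):
        """Finite-difference brightness gradient of the cue layer at (r, c)."""
        r0, r1 = max(r - 1, 0), min(r + 1, H - 1)
        c0, c1 = max(c - 1, 0), min(c + 1, W - 1)
        return np.array([cue[r1, c] - cue[r0, c], cue[r, c1] - cue[r, c0]])

    def step(pos):
        """Move one agent: random drift plus attraction toward brighter cue."""
        drift = rng.normal(0.0, 1.0, size=2)
        pull = 8.0 * gradient(int(pos[0]), int(pos[1]))
        new = np.clip(pos + drift + pull, 0, [H - 1, W - 1])
        # drag the color under the old position to the new position
        image[int(new[0]), int(new[1])] = image[int(pos[0]), int(pos[1])]
        return new

    agents = rng.random((20, 2)) * [H - 1, W - 1]   # 20 agents, random start
    for _ in range(500):
        agents = np.array([step(p) for p in agents])

What matters in Tolson's work is precisely what this sketch omits: the movement and painting rules were not written by hand but bred into neural nets by the genetic algorithm.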

A second example involving the evolution of neural nets, Henrik Lund's Artificial Painter [7], has a rather different flavor: each neural net is coded as a bit string so that the genetic algorithm can drive the simulated evolution. In this case, however, the computer-generated image is obtained from just one neural net, by mapping the net's output response to a color at every cell (i.e. every pixel) of the background image, so that the result fills the entire background.
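Lund's scheme can be made concrete with a small sketch. The Python fragment below is an assumption-laden illustration rather than the Artificial Painter's actual code: the 8-bits-per-weight decoding, the two-input network with one hidden layer and the tanh-to-RGB mapping are all choices made here for brevity. It shows only the rendering half of the system, i.e. how one bit-string genome yields a complete image; the genetic algorithm would operate on such bit strings.

    import numpy as np

    # Illustrative sketch (not Lund's code): decode a bit string into the
    # weights of a tiny feed-forward net, then color every pixel from its output.
    BITS = 8                                # assumed bits per weight
    HIDDEN = 6                              # assumed hidden-layer size
    N_WEIGHTS = 2 * HIDDEN + HIDDEN * 3     # 2 inputs -> HIDDEN -> 3 color outputs

    def decode(genome):
        """Map each 8-bit chunk of the genome to a weight in [-1, 1]."""
        chunks = [genome[i:i + BITS] for i in range(0, len(genome), BITS)]
        w = np.array([int(c, 2) / 127.5 - 1.0 for c in chunks])
        return w[:2 * HIDDEN].reshape(2, HIDDEN), w[2 * HIDDEN:].reshape(HIDDEN, 3)

    def render(genome, width=64, height=64):
        """Evaluate the net at every cell (pixel) and return an RGB image."""
        w_in, w_out = decode(genome)
        img = np.zeros((height, width, 3))
        for row in range(height):
            for col in range(width):
                xy = np.array([col / (width - 1), row / (height - 1)])
                hidden = np.tanh(xy @ w_in)
                img[row, col] = 0.5 * (np.tanh(hidden @ w_out) + 1.0)
        return img

    # One random genome stands in for a single individual in the GA population.
    rng = np.random.default_rng(1)
    genome = ''.join(rng.choice(list('01'), N_WEIGHTS * BITS))
    picture = render(genome)

Under a scheme like this, crossover and mutation act only on the bit string; the offspring's image is recovered simply by re-running the rendering step on the decoded net.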

In order to gain some understanding of the principles underlying the simulated evolution of populations of neural nets, we must turn to techniques that originated with Richard Dawkins.

Dawkins introduced the fundamental concept of user-guided evolution in a seminal paper given at the first Artificial


Fig. 1.

An example of a phenotype generated from the genotype of a binary basis function. (© Gary Greenfield) This basis function (after Maeda) is defined on the unit square. Its...

