5 Automated Facial Expression Analysis and the Mobilization of Affect

As he recalls it, Joseph Weizenbaum was moved to write his book Computer Power and Human Reason as a result of the public reaction to his experimental language analysis program, named ELIZA, which he developed at MIT in the mid-1960s.1 ELIZA’s incarnation as DOCTOR, a parody version of a psychotherapist asking inane reflexive questions, was wildly misinterpreted as an intelligent system, according to Weizenbaum. Much to his astonishment, some practicing psychiatrists actually thought that the DOCTOR program could be developed into a truly automated form of psychotherapy, “a therapeutic tool which can be made widely available to mental hospitals and psychiatric centers suffering a shortage of therapists.”2 The conscientious computer scientist objected: I had thought it essential, as a prerequisite of the very possibility that one person might help another learn to cope with his [sic] emotional problems, that the helper himself participate in the other’s experience of those problems and, in large part by way of his own empathetic recognition of them, himself come to understand them. . . . That it was possible for even one practicing psychiatrist to advocate that this crucial component of the therapeutic process be entirely supplanted by pure technique—that I had not imagined!3 Although others have since reinterpreted ELIZA in more techno-optimistic terms,4 Weizenbaum’s concern for his simulated psychiatrist’s incapacity for “empathetic recognition” is instructive for a critical analysis of the effort to program computers to “see” the human face. It is especially relevant to a dimension of the research that I consider in this chapter.
Whereas up until now I have focused largely on automated facial recognition as a technology of identification, here I investigate the development of automated facial expression analysis, the effort to program computers to recognize facial expressions as they form on and move across our faces. While facial recognition technology treats the face as a “blank somatic surface” to be differentiated from other faces as an index of identity, automated facial expression analysis treats the dynamic surface of the face as the site of differentiation.5 The dimensions and intensities of facial movements are analyzed as indices of emotion and cognition, as a means of determining what people are thinking and feeling. Experiments in automated facial expression analysis—or AFEA, as I will refer to it in this chapter—represent a subfield of computer vision research, overlapping with but often distinct from experiments in automated facial recognition.6 The automation of facial expression analysis promises to accomplish what facial recognition technology fails to do: read the interior of the person off the surface of the face, using the face itself as a field of classifiable information about the individual. The issues that Weizenbaum’s program raised about the possibility of using computers to perform some of the intellectual labor of psychotherapy, as well as the ethical implications of doing so, arise again in the case of automated facial expression analysis. The project of AFEA is tightly bound to the field of psychology: psychological theories of facial expression and emotion inform the design of AFEA systems, and those systems in turn promise to advance the field’s knowledge of facial expressions and emotion. Facial expression analysis is one approach to studying the psychophysiology of emotion, among an array of techniques for measuring the physiological manifestations of the affects.
In this sense, there are important differences between the ELIZA program and the project of AFEA. For its part, ELIZA was not a computer vision program but a language analyzer designed to automatically manipulate text that a user consciously typed into a computer, with the program drawing from a script to simulate the questions and responses of a psychotherapist. No one claimed that ELIZA could read the faces or emotions of the people with whom “she” was interacting. In fact, physiologists interested in emotion would have little use for ELIZA, having long asserted “the poverty of human language to represent emotions” and a preference for examining affects in their pre-linguistic phase rather than relying on the emotional self-awareness of human subjects.7 Like a long line of techniques before it, AFEA promises to bring this pre-linguistic phase of human emotion into greater visibility, irrespective of the conscious reflection of human subjects about their emotional states. Beyond the methodological issues it raises for the psychological sciences, the idea that an automated form of facial expression analysis might replace...