Sign Language Studies 2.1 (2001) 116-131



Book Review

Gesture, Speech and Sign


Gesture, Speech and Sign, edited by Lynn Messing and Ruth Campbell (Oxford: Oxford University Press, 1999, xxv + 227 pp., cloth, $85.00)

Gesture, Speech and Sign is an edited collection of eleven papers, together with a preface and an introduction. It is organized into three parts: The Neurobiology of Human Communication; The Relationships among Speech, Signs, and Gestures; and Epilogue: A Practical Application. In the preface the editors define the scope of the book as “a genuinely interdisciplinary scientific study of gesture,” thus signaling that the book is about gesture in relation to sign and speech. Most of the contributors refer to Hand and Mind: What Gestures Reveal about Thought by David McNeill (1992).

Synopsis

Preface. Messing and Campbell explain the difference between action and gesture, arguing that “gestures work communicatively in ways that other actions do not” and operate under “constraints of a psychological and cognitive nature” (ix). Invoking linguistic, psycholinguistic, and neurological evidence, they present the view that signed languages (SLs) are not mere gesture but full-fledged languages on a par with their spoken counterparts. They explain, however, that research on gesture in SLs has lagged because of the fear that SLs might thereby lose their status as “fully formed languages.” Speech-accompanying gestures (S-AGs), on the other hand, have not suffered such disgrace, yet research in that area is also fairly scant, since linguistics has long treated gesture as peripheral to speech.

Introduction: An Introduction to Signed Languages. The introduction provides background information on SLs. Messing emphasizes that American Sign Language (ASL) is a full-fledged language on a par with spoken ones and notes that the degree of similarity between two SLs cannot be predicted from the similarity between the corresponding spoken languages. Citing Valli and Lucas (1992), Messing shows that ASL meets most of the foundational criteria of a language, namely, productivity, displacement, pragmatic variety, role switching between addresser and addressee, signer self-correction, exposure to sign data, a metalinguistic dimension, and so on. Apart from ASL, we are informed, a number of other visual communication systems are used in the United States, such as Manually Coded English (MCE), contact signing, fingerspelling, Simultaneous Communication, and Cued Speech.

The Neurobiology of Human Communication

Neuropsychology of Communicative Movements. Feyereisen is interested in the way movements are transformed into meanings, and his objective is to review neuropsychological theories in order to determine the different components of human manual communication. The author invokes Kimura (1993) and Lieberman (1991, 1992) as emphasizing that vocalization (language) and manual activity (action) arise from the same neural systems. However, recent neuropsychological research on the control of action suggests that the execution of limb movements is preceded by mental representations of those movements and that these representations involve a multiplicity of tasks, such as transporting, catching, and manipulating, which depend on different cerebral mechanisms. In optic ataxia, a lesion affects the control of hand direction in space while leaving intact patients’ visual and tactile capacities to identify objects and to touch various parts of their own body. Patients with visual agnosia, on the other hand, have no trouble controlling hand movement in space but are impaired in recognizing objects visually, which suggests that there are “at least two visual cortical pathways, one for reaching . . . and one for recognizing objects” (6). Other evidence for the specialization of different neural systems in action and perception comes from research comparing seeing a gesture in order to imitate it with seeing it in order to understand it: meaningful gestures activate the left hemisphere and meaningless ones activate regions in the right hemisphere, although “the systems for gesture imitation and recognition are not independent” (8).

The second section of the paper deals with apraxia, a disorder of voluntary action. In neuropsychology, apraxia is assessed by means of three elicitation techniques: verbal command, imitation, and actual use of objects. Although apraxia affects manual and facial movements, it is not clear whether limb and buccofacial apraxia originate...