Abstract

This study addresses how emotion is conveyed in American Sign Language (ASL) and how it is distributed across the manual and facial channels of expression. Specifically, we examine both the production and perception of emotional expression in the manual channel by asking signers first to produce sentences in different emotional conditions and then to categorize sentences according to the emotions expressed. Findings show that, in production, sentences in the negative emotional conditions (sad and angry) exhibit distinctive profiles. In perception, signers are capable of recognizing different emotional states from manual signals alone. The errors that do occur, however, cluster along dimensions of intensity.
