Measuring posture features saliency in expressing affective states

R. De Silva, N. Bianchi-Berthouze
in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), conference paper (text)


Date: 28 September - 2 October 2004

Abstract: Today, creating systems that are capable of interacting naturally and efficiently with humans on many levels is essential. One step toward achieving this is the recognition of emotion from whole body postures of human partners.
Currently, little research in this area exists in computer science. Therefore, our aim is to identify and measure the saliency of posture features that play a role in affective expression. As a case study, we collected affective gestures from human subjects using a motion capture system. We first described these gestures with spatial features. Through standard statistical techniques, we verified that there was a statistically significant correlation between the emotion intended by the acting subjects and the emotion perceived by the observers. We then used Discriminant Analysis to measure the saliency of the proposed set of posture features in discriminating between 4 basic emotions: angry, fear, happy, and sad. Our results show that the set of features discriminates well between emotions, and also provides evidence of a strong overlap between descriptors in both acting and observing activities.
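The saliency measurement the abstract describes can be illustrated with a small sketch. This is not the paper's actual method or data: the feature values are synthetic, and a simple per-feature Fisher ratio (between-class variance over within-class variance) stands in for the full Discriminant Analysis, purely to show how one posture feature can be ranked as more discriminative than others across the four emotion classes.

```python
import numpy as np

# Hypothetical spatial posture descriptors; the data and the number of
# features are illustrative, not taken from the paper's motion-capture set.
rng = np.random.default_rng(0)
emotions = ["angry", "fear", "happy", "sad"]

# Synthetic samples: 20 postures per emotion, 3 features each.
# Feature 0 is made discriminative by shifting its mean per emotion;
# features 1 and 2 are pure noise, so they should score low.
data = {}
for i, emo in enumerate(emotions):
    X = rng.normal(size=(20, 3))
    X[:, 0] += 3.0 * i  # class-dependent shift -> a salient feature
    data[emo] = X

def fisher_ratio(data):
    """Per-feature saliency: between-class variance of class means
    divided by the mean within-class variance."""
    class_means = np.array([X.mean(axis=0) for X in data.values()])
    within = np.array([X.var(axis=0) for X in data.values()]).mean(axis=0)
    between = class_means.var(axis=0)
    return between / within

saliency = fisher_ratio(data)
print(saliency)  # feature 0 should dominate the other two
```

A full Discriminant Analysis, as used in the paper, would additionally combine features into discriminant axes rather than scoring them one at a time; the per-feature ratio above is the simplest version of the same idea.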