Modeling human affective postures: an information theoretic characterization of posture features
Author(s) -
De Silva P. Ravindra,
Bianchi-Berthouze Nadia
Publication year - 2004
Publication title -
Computer Animation and Virtual Worlds
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.225
H-Index - 49
eISSN - 1546-427X
pISSN - 1546-4261
DOI - 10.1002/cav.29
Subject(s) - gesture , computer science , set (abstract data type) , facial expression , mood , artificial intelligence , dance , motion (physics) , emotion classification , cognitive psychology , psychology , social psychology , art , literature , programming language
One of the challenging issues in affective computing is giving a machine the ability to recognize a person's mood. Efforts in that direction have mainly focused on facial and oral cues. Gestures have recently been considered as well, but with less success. Our aim is to fill this gap by identifying and measuring the saliency of posture features that play a role in affective expression. As a case study, we collected affective gestures from human subjects using a motion capture system. We first described these gestures with spatial features, as suggested in studies on dance. Through standard statistical techniques, we verified that there was a statistically significant correlation between the emotion intended by the acting subjects and the emotion perceived by the observers. We used Discriminant Analysis to build affective posture predictive models and to measure the saliency of the proposed set of posture features in discriminating between four basic emotional states: angry, fear, happy, and sad. An information-theoretic characterization of the models shows that the set of features discriminates well between emotions, and also that the models built outperform the human observers. Copyright © 2004 John Wiley & Sons, Ltd.
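The information-theoretic comparison the abstract describes can be illustrated with a minimal sketch: given a confusion matrix between intended and recognized emotion labels, the mutual information I(intended; recognized) measures, in bits, how well a recognizer (a predictive model or a human observer) discriminates the four categories. The confusion counts below are invented for illustration and are not data from the paper.

```python
import math

def mutual_information(confusion):
    """Mutual information I(X;Y) in bits, from a joint count matrix.

    Rows index the intended emotion X, columns the recognized emotion Y.
    """
    total = sum(sum(row) for row in confusion)
    joint = [[c / total for c in row] for row in confusion]
    px = [sum(row) for row in joint]                       # marginal of intended
    py = [sum(row[j] for row in joint)                     # marginal of recognized
          for j in range(len(joint[0]))]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# Hypothetical confusion matrices over the four emotions
# (angry, fear, happy, sad) -- purely illustrative numbers.
model_confusion = [[20, 2, 1, 2],
                   [3, 18, 2, 2],
                   [1, 2, 21, 1],
                   [2, 3, 1, 19]]
human_confusion = [[15, 4, 3, 3],
                   [5, 13, 4, 3],
                   [3, 4, 16, 2],
                   [4, 5, 2, 14]]

print(f"model MI: {mutual_information(model_confusion):.3f} bits")
print(f"human MI: {mutual_information(human_confusion):.3f} bits")
```

A perfectly diagonal matrix over four equiprobable emotions yields the maximum of 2 bits; a recognizer whose confusion matrix carries more mass on the diagonal attains higher mutual information, which is the sense in which a model can be said to outperform human observers.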