Toward a Minimal Representation of Affective Gestures

  • Authors:
  • Donald Glowinski, Nele Dael, Antonio Camurri, Gualtiero Volpe, Marcello Mortillaro, Klaus Scherer

  • Affiliations:
  • University of Genoa, Genoa (Glowinski, Camurri, Volpe); Swiss Center for Affective Sciences, Geneva (Dael, Mortillaro, Scherer)

  • Venue:
  • IEEE Transactions on Affective Computing
  • Year:
  • 2011

Abstract

This paper presents a framework for the analysis of affective behavior starting from a reduced amount of visual information related to human upper-body movements. The main goal is to identify a minimal representation of emotional displays based on nonverbal gesture features. The GEMEP (Geneva multimodal emotion portrayals) corpus was used to validate this framework. Twelve emotions expressed by 10 actors form the selected data set of emotion portrayals. Visual tracking of head and hand trajectories was performed from a frontal and a lateral view. Postural/shape and dynamic expressive gesture features were identified and analyzed. A feature reduction procedure was carried out, resulting in a 4D model of emotion expression that effectively grouped emotions according to their valence (positive, negative) and arousal (high, low). These results show that emotionally relevant information can be extracted from the dynamic qualities of gesture. The framework was implemented as software modules (plug-ins) extending the EyesWeb XMI Expressive Gesture Processing Library and will be used in user-centric, networked media applications, including future mobile devices characterized by low computational resources and limited sensor systems.
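To make the pipeline described in the abstract concrete, the sketch below shows how a few expressive gesture features of the kinds mentioned (a dynamic feature such as motion energy, a postural/shape feature such as spatial extent, and a smoothness measure) could be computed from tracked head and hand trajectories. This is an illustrative approximation, not the paper's actual feature set: the function names, the assumed frame rate, and the exact feature definitions are the author's assumptions here, and the paper's EyesWeb XMI implementation may compute these quantities differently.

```python
# Illustrative sketch of expressive gesture features from 2D trajectories.
# A trajectory is a list of (x, y) points, one per video frame.
# FPS is an assumed camera frame rate; the corpus's actual rate may differ.
FPS = 25

def velocities(traj, fps=FPS):
    """Finite-difference velocity of a 2D trajectory [(x, y), ...]."""
    dt = 1.0 / fps
    return [((x2 - x1) / dt, (y2 - y1) / dt)
            for (x1, y1), (x2, y2) in zip(traj, traj[1:])]

def motion_energy(traj, fps=FPS):
    """Mean squared speed: a dynamic feature, high for agitated gestures."""
    v = velocities(traj, fps)
    return sum(vx * vx + vy * vy for vx, vy in v) / len(v)

def spatial_extent(head, left_hand, right_hand):
    """Mean bounding-box area spanned by head and hands per frame:
    a postural/shape feature capturing how expanded the pose is."""
    areas = []
    for pts in zip(head, left_hand, right_hand):
        xs = [p[0] for p in pts]
        ys = [p[1] for p in pts]
        areas.append((max(xs) - min(xs)) * (max(ys) - min(ys)))
    return sum(areas) / len(areas)

def smoothness(traj, fps=FPS):
    """Inverse mean squared acceleration: smooth, fluid gestures score
    near 1, jerky gestures near 0 (a simple jerkiness proxy)."""
    a = velocities(velocities(traj, fps), fps)  # second derivative
    msa = sum(ax * ax + ay * ay for ax, ay in a) / len(a)
    return 1.0 / (1.0 + msa)
```

Per-gesture feature vectors built from such measures could then be fed to a standard dimensionality-reduction step (e.g., PCA) to obtain a low-dimensional representation analogous to the 4D model reported in the paper.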