Collection and Annotation of a Corpus of Human-Human Multimodal Interactions: Emotion and Others Anthropomorphic Characteristics

  • Authors:
  • Aurélie Zara, Valérie Maffiolo, Jean Claude Martin, Laurence Devillers

  • Affiliations:
  • France Telecom Orange Labs, 2 av. P. Marzin, 22300 Lannion, France and LIMSI-CNRS, BP 133, 91403 Orsay cedex, France
  • France Telecom Orange Labs, 2 av. P. Marzin, 22300 Lannion, France
  • LIMSI-CNRS, BP 133, 91403 Orsay cedex, France
  • LIMSI-CNRS, BP 133, 91403 Orsay cedex, France

  • Venue:
  • ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
  • Year:
  • 2007


Abstract

In order to design affective interactive systems, experimental grounding is required for studying expressions of emotion during interaction. In this paper, we present the EmoTaboo protocol for collecting multimodal emotional behaviours occurring during human-human interactions in a game context. First annotations revealed that the collected data contain various multimodal expressions of emotions and other mental states. In order to reduce the influence of language via a predetermined set of labels and to take into account differences between coders in their capacity to verbalize their perception, we introduce a new annotation methodology based on 1) a hierarchical taxonomy of emotion-related words, and 2) the design of the annotation interface. Future directions include the implementation of such an annotation tool and its evaluation for the annotation of multimodal interactive and emotional behaviours. We will also extend our first annotation scheme to several other characteristics that are interdependent with emotions.
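
To make the idea of a hierarchical taxonomy of emotion-related labels concrete, the following is a minimal Python sketch. The category names, hierarchy, and class design below are illustrative assumptions, not the actual taxonomy or tooling used in the EmoTaboo annotation scheme described in the paper.

```python
# Minimal sketch of a hierarchical taxonomy of emotion-related labels.
# The label names and the hierarchy are hypothetical placeholders, not the
# taxonomy actually used in the EmoTaboo annotation scheme.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Label:
    name: str
    children: list["Label"] = field(default_factory=list)
    parent: Optional["Label"] = None

    def add(self, child: "Label") -> "Label":
        # Attach a finer-grained label under this node.
        child.parent = self
        self.children.append(child)
        return child

    def path(self) -> list[str]:
        # Path from the root to this label, e.g. ["emotion-related state", "negative", "anger"].
        node, names = self, []
        while node is not None:
            names.append(node.name)
            node = node.parent
        return list(reversed(names))


# Hypothetical hierarchy: a coder may choose a coarse node ("negative") or a
# finer one ("irritation"), so annotations from coders with different
# capacities to verbalize their perception remain comparable via shared ancestors.
root = Label("emotion-related state")
negative = root.add(Label("negative"))
negative.add(Label("anger"))
negative.add(Label("irritation"))
positive = root.add(Label("positive"))
positive.add(Label("amusement"))

print(positive.children[0].path())  # ['emotion-related state', 'positive', 'amusement']
```

Storing each annotation as a path through such a tree allows coarse and fine labels to be compared at any level of the hierarchy, which is one way an annotation interface could accommodate coder differences; this design choice is an assumption, not a description of the authors' tool.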