Levels of representation in the annotation of emotion for the specification of expressivity in ECAs

  • Authors:
  • Jean-Claude Martin (LIMSI-CNRS, Orsay Cedex, France)
  • Sarkis Abrilian (LIMSI-CNRS, Orsay Cedex, France)
  • Laurence Devillers (LIMSI-CNRS, Orsay Cedex, France)
  • Myriam Lamolle (LINC, IUT de Montreuil, Université Paris, France)
  • Maurizio Mancini (LINC, IUT de Montreuil, Université Paris, France)
  • Catherine Pelachaud (LINC, IUT de Montreuil, Université Paris, France)

  • Venue:
  • Lecture Notes in Computer Science
  • Year:
  • 2005


Abstract

In this paper, we present a two-step approach to the creation of affective Embodied Conversational Agents (ECAs): annotation of a real-life, non-acted emotional corpus, followed by animation by copy-synthesis. The basis of our approach is to study how coders perceive and annotate, at several levels, the emotions observed in a corpus of emotionally rich TV video interviews. We use their annotations to specify the expressive behavior of an agent at the corresponding levels. We explain how such an approach can provide knowledge for specifying non-basic patterns of emotional behavior to be displayed by the ECA (e.g. which perceptual cues and levels of annotation are required to enable proper recognition of the emotions).
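To make the idea of multi-level annotation concrete, here is a minimal sketch of how one annotated corpus segment might be represented. All names, levels, and values below are hypothetical illustrations, not the paper's actual coding scheme.

```python
from dataclasses import dataclass, field

# Hypothetical sketch (names and levels are illustrative, not the
# paper's coding scheme): one corpus segment annotated at several
# levels, from a coarse verbal label down to perceptual cues.
@dataclass
class EmotionAnnotation:
    segment_id: str
    # Level 1: a verbal emotion label chosen by the coder
    label: str
    # Level 2: continuous dimensions (e.g. valence, intensity)
    dimensions: dict = field(default_factory=dict)
    # Level 3: perceptual cues observed in the video (gaze, gesture, ...)
    cues: list = field(default_factory=list)

# A coder's annotation of one hypothetical TV-interview segment:
ann = EmotionAnnotation(
    segment_id="interview_03_seg_12",
    label="despair",
    dimensions={"valence": -0.8, "intensity": 0.9},
    cues=["lowered gaze", "slow head shake"],
)
print(ann.label, len(ann.cues))
```

In a copy-synthesis setting, records like this would serve as the input specification from which the ECA's expressive behavior is generated at the corresponding levels.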