Emotion detection in task-oriented spoken dialogues

  • Authors:
  • L. Devillers; L. Lamel; I. Vasilescu

  • Affiliations:
  • LIMSI, CNRS, Orsay, France; LIMSI, CNRS, Orsay, France; Institute for Human-Machine Communication, Munich University of Technology, München, Germany

  • Venue:
  • ICME '03: Proceedings of the 2003 International Conference on Multimedia and Expo, Volume 3
  • Year:
  • 2003

Abstract

Detecting emotions in the context of automated call center services can be helpful for following the evolution of human-computer dialogues, enabling dynamic modification of the dialogue strategies and influencing the final outcome. The emotion detection work reported here is part of a larger study aiming to model user behavior in real interactions. We make use of a corpus of real agent-client spoken dialogues in which the manifestation of emotion is quite complex; shaded emotions are common since the interlocutors attempt to control the expression of their internal attitudes. Our aims are to define emotions appropriate for call center services, to annotate the dialogues, to validate the presence of emotions via perceptual tests, and to find robust cues for emotion detection. In contrast to research carried out on artificial data with simulated emotions, for real-life corpora the set of appropriate emotion labels must be determined. Two studies are reported: the first investigates automatic emotion detection using linguistic information, whereas the second concerns perceptual tests for identifying emotions as well as the prosodic and textual cues that signal them. About 11% of the utterances are annotated with non-neutral emotion labels. Preliminary experiments using lexical cues detect about 70% of these labels.
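The lexical-cue approach mentioned in the abstract can be illustrated with a minimal unigram classifier. This is a sketch under stated assumptions, not the paper's actual method: the tiny training corpus, the two-label set (neutral vs. negative), and the smoothed Naive Bayes model below are all illustrative inventions, standing in for whatever lexical model and annotation scheme the authors used on their real call-center data.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy corpus: utterances labeled "neutral" or "negative".
# These examples are invented; the paper's real corpus is not reproduced here.
TRAIN = [
    ("i would like my account balance please", "neutral"),
    ("can you transfer me to an agent", "neutral"),
    ("what are your opening hours", "neutral"),
    ("this is the third time i call and nothing works", "negative"),
    ("i am really annoyed with this service", "negative"),
    ("nothing works and nobody helps me", "negative"),
]

def train(corpus):
    """Count per-label unigrams and label frequencies for add-one smoothing."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    for text, label in corpus:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, label_counts, vocab

def classify(text, word_counts, label_counts, vocab):
    """Return the label maximizing the smoothed unigram Naive Bayes score."""
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)  # label prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

word_counts, label_counts, vocab = train(TRAIN)
print(classify("i am annoyed nothing works", word_counts, label_counts, vocab))
# → negative
```

A realistic system would of course train on thousands of annotated utterances and likely combine such lexical scores with the prosodic cues the paper also investigates, but the core idea of detecting emotion labels from word-level evidence is the same.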