Influence of contextual information in emotion annotation for spoken dialogue systems

  • Authors:
  • Zoraida Callejas; Ramón López-Cózar

  • Affiliations:
  • Department of Languages and Computer Systems, Faculty of Computer Science and Telecommunications, University of Granada, C/Periodista Daniel Saucedo Aranda s/n, 18071 Granada, Spain (both authors)

  • Venue:
  • Speech Communication
  • Year:
  • 2008


Abstract

In this paper, we study the impact of contextual information on the annotation of emotions. Specifically, we propose including the history of the user-system interaction and the user's neutral speaking style. We have developed a new method that automatically incorporates both sources of information, using novel techniques for acoustic normalization and dialogue context annotation. We have carried out experiments with a corpus extracted from real human interactions with a spoken dialogue system. Results show that contextual information affects both the performance of non-expert human annotators and machine-learned classifiers. The proposed method allows more non-neutral emotions to be annotated and yields values closer to the maximum agreement rates for non-expert human annotation. Moreover, automatic classification accuracy improves by 29.57% compared with the classical approach based only on acoustic features.
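The abstract does not detail how the acoustic normalization against a user's neutral speaking style is performed. As a minimal sketch of the general idea, one could express each utterance's acoustic features as z-score deviations from a per-user neutral profile; the feature names and the z-score formulation below are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: normalize acoustic features (e.g. pitch, energy)
# against a user's neutral speaking style via per-user z-scores.
# Feature names and formulation are assumptions for illustration only.

def neutral_profile(neutral_frames):
    """Mean and standard deviation of each feature over a user's
    neutral utterances."""
    n = len(neutral_frames)
    keys = neutral_frames[0].keys()
    mean = {k: sum(f[k] for f in neutral_frames) / n for k in keys}
    var = {k: sum((f[k] - mean[k]) ** 2 for f in neutral_frames) / n
           for k in keys}
    # Guard against zero variance so division below stays defined.
    std = {k: (var[k] ** 0.5) or 1.0 for k in keys}
    return mean, std

def normalize(frame, mean, std):
    """Express a frame's features as deviations from the neutral style."""
    return {k: (frame[k] - mean[k]) / std[k] for k in frame}

# Toy usage: two neutral utterances define the profile, then an
# emotionally coloured utterance is normalized against it.
neutral = [{"pitch": 120.0, "energy": 0.50},
           {"pitch": 124.0, "energy": 0.54}]
mean, std = neutral_profile(neutral)
print(normalize({"pitch": 150.0, "energy": 0.70}, mean, std))
```

Normalizing in this way lets a classifier see how far an utterance departs from that particular speaker's baseline, rather than comparing raw feature values across speakers.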