Evaluation and Discussion of Multi-modal Emotion Recognition

  • Authors:
  • Ahmad Rabie, Britta Wrede, Thurid Vogt, Marc Hanheide

  • Venue:
  • ICCEE '09 Proceedings of the 2009 Second International Conference on Computer and Electrical Engineering - Volume 01
  • Year:
  • 2009

Abstract

Recognition of emotions from multimodal cues is of fundamental interest for the design of many adaptive interfaces in human-machine and human-robot interaction, as it provides a means to incorporate non-verbal feedback into the course of the interaction. Humans express their emotional state largely unconsciously, exploiting their different natural communication modalities. In this paper, we present a first study on multimodal recognition of emotions from auditory and visual cues for interaction interfaces. We recognize seven classes of basic emotions by means of visual analysis of talking faces; in parallel, the audio signal is analyzed on the basis of the intonation of the verbal articulation. We compare the performance of state-of-the-art recognition systems on the DaFEx database for both complementary modalities, and we discuss these results with regard to the theoretical background and possible fusion schemes in real-world multimodal interfaces.
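
The abstract leaves the fusion scheme open; one common baseline for combining two unimodal classifiers is decision-level (late) fusion, i.e. a weighted average of their per-class probability estimates. The sketch below is illustrative only and not taken from the paper: the label set and the weighting are assumptions, and `late_fusion` is a hypothetical helper, not an API from the authors' system.

```python
# Seven basic emotion classes (Ekman's six plus neutral); the exact label
# set used with the DaFEx database is an assumption here.
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

def late_fusion(p_face, p_voice, w_face=0.5):
    """Decision-level fusion: weighted average of the per-class probability
    estimates from a facial-expression classifier and a prosody classifier."""
    fused = [w_face * f + (1.0 - w_face) * v for f, v in zip(p_face, p_voice)]
    total = sum(fused)
    return [p / total for p in fused]  # renormalize to a distribution

# Example: the face classifier favors "happiness", the voice classifier
# favors "surprise"; with w_face=0.6 the fused decision follows the face.
p_face = [0.05, 0.05, 0.05, 0.50, 0.05, 0.25, 0.05]
p_voice = [0.05, 0.05, 0.05, 0.30, 0.05, 0.45, 0.05]
fused = late_fusion(p_face, p_voice, w_face=0.6)
print(EMOTIONS[fused.index(max(fused))])  # -> happiness
```

A practical reason to prefer late fusion in such interfaces is that either modality can drop out (face occluded, speaker silent) and the remaining classifier's output can still be used on its own.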