Multimodal emotion classification in naturalistic user behavior

  • Authors:
  • Steffen Walter, Stefan Scherer, Martin Schels, Michael Glodek, David Hrabal, Miriam Schmidt, Ronald Böck, Kerstin Limbrecht, Harald C. Traue, Friedhelm Schwenker

  • Affiliations:
  • Medical Psychology, University of Ulm, Germany; Institute of Neural Information Processing, University of Ulm, Germany; Institute of Neural Information Processing, University of Ulm, Germany; Institute of Neural Information Processing, University of Ulm, Germany; Medical Psychology, University of Ulm, Germany; Institute of Neural Information Processing, University of Ulm, Germany; Otto von Guericke University Magdeburg, Germany; Medical Psychology, University of Ulm, Germany; Medical Psychology, University of Ulm, Germany; Institute of Neural Information Processing, University of Ulm, Germany

  • Venue:
  • HCII'11: Proceedings of the 14th International Conference on Human-Computer Interaction: Towards Mobile and Intelligent Interaction Environments - Volume Part III
  • Year:
  • 2011

Abstract

The design of intelligent personalized interactive systems that have knowledge of the user's state, desires, needs, and wishes currently poses a great challenge to computer scientists. In this study we propose an information fusion approach that combines acoustic and biophysiological data from multiple sensors to classify emotional states. For this purpose a multimodal corpus was created in which subjects underwent a controlled emotion-eliciting experiment, passing through several octants of the valence-arousal-dominance space. The temporal and decision-level fusion of the multiple modalities outperforms the single-modality classifiers and shows promising results.
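The abstract mentions decision-level fusion of per-modality classifiers. The paper does not specify the combination rule here, but a common instance of the technique is weighted averaging of class posteriors; the sketch below is a minimal illustration under that assumption, with hypothetical numbers for an acoustic and a biophysiological channel (the function name and weights are ours, not the authors').

```python
import numpy as np

def decision_level_fusion(posteriors, weights=None):
    """Fuse per-modality class posteriors by weighted averaging.

    posteriors: list of 1-D arrays, one per modality, each summing to 1.
    weights: optional per-modality weights (e.g. validation accuracies);
             uniform if omitted.
    Returns the fused class index and the fused distribution.
    """
    P = np.asarray(posteriors, dtype=float)   # shape: (modalities, classes)
    w = np.ones(len(P)) if weights is None else np.asarray(weights, dtype=float)
    w = w / w.sum()                           # normalize weights to sum to 1
    fused = w @ P                             # weighted average over modalities
    return int(np.argmax(fused)), fused

# Hypothetical example: the acoustic classifier favors class 1, the
# biophysiological one favors class 0; fusion trades the votes off.
acoustic = [0.2, 0.7, 0.1]
biophysio = [0.6, 0.3, 0.1]
label, fused = decision_level_fusion([acoustic, biophysio], weights=[0.4, 0.6])
```

With these weights the fused distribution remains a valid probability vector, and the decision can differ from either single-modality classifier, which is the point of combining modalities at the decision level.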