Person-independent estimation of emotional experiences from facial expressions

  • Authors:
  • Timo Partala; Veikko Surakka; Toni Vanhala

  • Affiliations:
  • University of Tampere, Finland; University of Tampere, Finland and Tampere University Hospital, Tampere, Finland; University of Tampere, Finland

  • Venue:
  • Proceedings of the 10th international conference on Intelligent user interfaces
  • Year:
  • 2005

Abstract

The aim of this research was to develop methods for the automatic, person-independent estimation of experienced emotions from facial expressions. Ten subjects watched a series of emotionally arousing pictures and videos while the electromyographic (EMG) activity of two facial muscles, zygomaticus major (activated in smiling) and corrugator supercilii (activated in frowning), was registered. Based on the changes in the activity of these two muscles, it was possible to distinguish between ratings of positive and negative emotional experiences at a rate of almost 70% for pictures and over 80% for videos. Using these methods, the computer could adapt its behavior to the user's emotions during human-computer interaction.
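The abstract does not state the exact decision rule used to map the two EMG signals to a valence estimate, so the following is only a minimal illustrative sketch: it assumes a simple comparison of baseline-relative activity changes, with the function name, inputs, and threshold logic all hypothetical rather than the authors' actual method.

```python
def classify_valence(zygomaticus_change: float, corrugator_change: float) -> str:
    """Estimate emotional valence from baseline-relative changes in facial EMG.

    Hypothetical rule (not from the paper): if the smiling muscle
    (zygomaticus major) increases more than the frowning muscle
    (corrugator supercilii), label the experience positive; otherwise
    label it negative.
    """
    if zygomaticus_change > corrugator_change:
        return "positive"
    return "negative"


# Example: per-stimulus EMG changes for a few hypothetical trials (arbitrary units)
trials = [(0.8, 0.1), (0.05, 0.6), (0.3, 0.25)]
print([classify_valence(z, c) for z, c in trials])
```

Any person-independent system along these lines would additionally need signal preprocessing (rectification, baseline normalization) and validation across subjects, which the sketch above omits.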