An Integrated Approach to Emotion Recognition for Advanced Emotional Intelligence

  • Authors:
  • Panagiotis D. Bamidis; Christos A. Frantzidis; Evdokimos I. Konstantinidis; Andrej Luneski; Chrysa Lithari; Manousos A. Klados; Charalambos Bratsas; Christos L. Papadelis; Costas Pappas

  • Affiliations:
  • Lab of Medical Informatics, Medical School, Aristotle University of Thessaloniki, Greece (all authors except C. L. Papadelis); Center for Mind/Brain Sciences (CIMeC), University of Trento, Mattarello, Italy (C. L. Papadelis)

  • Venue:
  • Proceedings of the 13th International Conference on Human-Computer Interaction. Part III: Ubiquitous and Intelligent Interaction
  • Year:
  • 2009


Abstract

Emotion identification is beginning to be considered an essential feature of human-computer interaction. However, most studies have focused mainly on facial expression classification and speech recognition, and until recently little attention has been paid to physiological pattern recognition. In this paper, an integrative approach to emotional interaction is proposed that fuses multi-modal signals. Subjects are exposed to pictures selected from the International Affective Picture System (IAPS). A feature extraction procedure is used to discriminate between four affective states by means of a Mahalanobis distance classifier. The average classification rate (74.11%) was encouraging. The induced affective state is then mirrored through an avatar, which changes its facial characteristics and generates a voice message sympathising with the user's mood. It is argued that multi-physiological patterning in combination with anthropomorphic avatars may contribute to the enhancement of affective multi-modal interfaces and the advancement of machine emotional intelligence.
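A Mahalanobis distance classifier of the kind the abstract describes assigns a feature vector to the class whose mean it is closest to, with distance weighted by that class's inverse covariance. The paper does not publish its feature pipeline, so the sketch below is a minimal illustration: the two-dimensional features, class labels, and function names are assumptions, not the authors' implementation.

```python
import numpy as np

def fit_mahalanobis_classifier(features_by_class):
    """Estimate a mean vector and inverse covariance matrix per class.

    features_by_class: dict mapping a class label to an (n_samples, n_features)
    array of training feature vectors (illustrative stand-in for the paper's
    physiological features).
    """
    params = {}
    for label, X in features_by_class.items():
        mu = X.mean(axis=0)
        inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
        params[label] = (mu, inv_cov)
    return params

def classify(x, params):
    """Return the label whose class mean has the smallest Mahalanobis
    distance to x: d^2 = (x - mu)^T * Sigma^-1 * (x - mu)."""
    def sq_dist(label):
        mu, inv_cov = params[label]
        d = x - mu
        return float(d @ inv_cov @ d)
    return min(params, key=sq_dist)

# Toy demo with two synthetic, well-separated "affective state" clusters
# (the real system discriminated four states from physiological signals).
rng = np.random.default_rng(0)
data = {
    "high_arousal": rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2)),
    "low_arousal": rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2)),
}
model = fit_mahalanobis_classifier(data)
print(classify(np.array([1.8, 2.1]), model))  # falls near the high-arousal mean
```

Unlike plain Euclidean nearest-mean classification, the inverse-covariance weighting discounts directions in which a class's features naturally vary, which is why it is a common choice for noisy physiological data.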