Toward emotion aware computing: an integrated approach using multichannel neurophysiological recordings and affective visual stimuli

  • Authors:
  • Christos A. Frantzidis;Charalampos Bratsas;Christos L. Papadelis;Evdokimos Konstantinidis;Costas Pappas;Panagiotis D. Bamidis

  • Affiliations:
  • Laboratory of Medical Informatics, Medical School, Aristotle University of Thessaloniki, Thessaloniki, Greece;Center for Mind/Brain, University of Trento, Trento, Italy

  • Venue:
  • IEEE Transactions on Information Technology in Biomedicine - Special section on new and emerging technologies in bioinformatics and bioengineering
  • Year:
  • 2010

Abstract

This paper proposes a methodology for the robust classification of neurophysiological data into four emotional states, collected during passive viewing of emotionally evocative pictures selected from the International Affective Picture System. The proposed classification model follows current neuroscience trends, since it adopts the independence of the two emotional dimensions, arousal and valence, as dictated by the bidimensional emotion theory, and it is gender-specific. A two-step classification procedure is proposed for the discrimination of emotional states from EEG signals evoked by pleasant and unpleasant stimuli, which also vary in their arousal/intensity levels. The first classification step discriminates arousal; valence discrimination is then performed. A Mahalanobis distance (MD)-based classifier and support vector machines (SVMs) were used for the discrimination of emotions. The achieved overall classification rates were 79.5% and 81.3% for the MD and SVM classifiers, respectively, significantly higher than in previous studies. The robust classification of objective emotional measures is the first step toward numerous applications within the sphere of human-computer interaction.
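
The two-step scheme described in the abstract (arousal first, then valence within each arousal branch) lends itself to a hierarchical classifier. The sketch below is not the authors' implementation; the use of scikit-learn, RBF-kernel SVMs, the class encoding, and the assumption that EEG features arrive as a precomputed (n_trials, n_features) matrix are illustrative choices only.

```python
# Hypothetical sketch of a two-step (arousal-first, then valence) emotion classifier.
# Feature extraction from the EEG recordings is not shown; X is assumed to be an
# (n_trials, n_features) array, with labels encoding arousal (0 = low, 1 = high)
# and valence (0 = unpleasant, 1 = pleasant).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


class TwoStepEmotionClassifier:
    """First discriminate arousal, then valence within each arousal branch."""

    def __init__(self):
        self.arousal_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        # One valence classifier per predicted arousal level.
        self.valence_clfs = {
            0: make_pipeline(StandardScaler(), SVC(kernel="rbf")),
            1: make_pipeline(StandardScaler(), SVC(kernel="rbf")),
        }

    def fit(self, X, arousal, valence):
        self.arousal_clf.fit(X, arousal)
        for level, clf in self.valence_clfs.items():
            mask = arousal == level
            clf.fit(X[mask], valence[mask])
        return self

    def predict(self, X):
        arousal_pred = self.arousal_clf.predict(X)
        valence_pred = np.empty_like(arousal_pred)
        for level, clf in self.valence_clfs.items():
            mask = arousal_pred == level
            if mask.any():
                valence_pred[mask] = clf.predict(X[mask])
        # Combine the two binary decisions into one of four emotional states.
        return arousal_pred * 2 + valence_pred


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(160, 32))          # 160 trials, 32 hypothetical EEG features
    arousal = rng.integers(0, 2, size=160)  # 0 = low, 1 = high
    valence = rng.integers(0, 2, size=160)  # 0 = unpleasant, 1 = pleasant
    model = TwoStepEmotionClassifier().fit(X, arousal, valence)
    print(model.predict(X[:5]))             # four-state codes in {0, 1, 2, 3}
```

The Mahalanobis distance classifier mentioned in the abstract could be slotted into the same structure by replacing each SVC with a nearest-class-mean rule under a pooled covariance estimate, and the gender-specific aspect of the model would amount to fitting separate instances for male and female participants.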