Single trial classification of EEG and peripheral physiological signals for recognition of emotions induced by music videos

  • Authors and affiliations:
  • Sander Koelstra (Department of Electronic Engineering, Queen Mary University of London)
  • Ashkan Yazdani (Ecole Polytechnique Fédérale de Lausanne)
  • Mohammad Soleymani (Computer Vision and Multimedia Laboratory, University of Geneva)
  • Christian Mühl (Human Media Interaction Group, University of Twente)
  • Jong-Seok Lee (Ecole Polytechnique Fédérale de Lausanne)
  • Anton Nijholt (Human Media Interaction Group, University of Twente)
  • Thierry Pun (Computer Vision and Multimedia Laboratory, University of Geneva)
  • Touradj Ebrahimi (Ecole Polytechnique Fédérale de Lausanne)
  • Ioannis Patras (Department of Electronic Engineering, Queen Mary University of London)

  • Venue:
  • BI'10: Proceedings of the 2010 International Conference on Brain Informatics
  • Year:
  • 2010

Abstract

Recently, the field of automatic recognition of users' affective states has gained a great deal of attention. Automatic, implicit recognition of affective states has many applications, ranging from personalized content recommendation to automatic tutoring systems. In this work, we present promising results of our research on the classification of emotions induced by watching music videos. We show robust correlations between users' self-assessments of arousal and valence and the frequency-band powers of their EEG activity. We present methods for single-trial classification using both EEG and peripheral physiological signals. For EEG, average (maximum) classification rates of 55.7% (67.0%) for arousal and 58.8% (76.0%) for valence were obtained. For peripheral physiological signals, the results were 58.9% (85.5%) for arousal and 54.2% (78.5%) for valence.
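
The abstract describes two analysis steps: correlating EEG frequency-band power with self-assessed arousal and valence, and single-trial classification of those ratings. The sketch below illustrates both steps under stated assumptions; it is not the authors' pipeline. The 128 Hz sampling rate, the band definitions, the choice of Spearman correlation, and the Gaussian naive Bayes classifier with leave-one-out cross-validation are all illustrative placeholders.

    # Minimal sketch (not the paper's exact method) of band-power extraction,
    # correlation with self-assessments, and single-trial classification.
    import numpy as np
    from scipy.signal import welch
    from scipy.stats import spearmanr
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score, LeaveOneOut

    FS = 128  # assumed EEG sampling rate (Hz)
    BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30), "gamma": (30, 45)}

    def band_powers(trial):
        """trial: (n_channels, n_samples) EEG for one video.
        Returns log band power per channel and band as a feature vector."""
        feats = []
        for ch in trial:
            f, pxx = welch(ch, fs=FS, nperseg=FS * 2)  # Welch power spectrum
            for lo, hi in BANDS.values():
                mask = (f >= lo) & (f < hi)
                feats.append(np.log(np.trapz(pxx[mask], f[mask]) + 1e-12))
        return np.array(feats)

    def analyse(trials, ratings):
        """trials: (n_trials, n_channels, n_samples); ratings: one self-assessed
        arousal (or valence) score per trial."""
        X = np.array([band_powers(t) for t in trials])
        # Step 1: correlate each band-power feature with the self-assessments.
        rhos = np.array([spearmanr(X[:, j], ratings)[0] for j in range(X.shape[1])])
        # Step 2: single-trial classification. Threshold ratings at the median
        # into low/high classes and estimate accuracy with leave-one-out CV.
        y = (ratings > np.median(ratings)).astype(int)
        acc = cross_val_score(GaussianNB(), X, y, cv=LeaveOneOut()).mean()
        return rhos, acc

The same classification step would apply unchanged to features computed from peripheral physiological signals (e.g., skin conductance or heart rate statistics) in place of the EEG band powers.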