Affective computation on EEG correlates of emotion from musical and vocal stimuli

  • Authors:
  • Reza Khosrowabadi, Abdul Wahab, Kai Keng Ang, Mohammad H. Baniasad

  • Affiliations:
  • Centre for Computational Intelligence, School of Computer Engineering, Nanyang Technological University, Singapore
  • Institute for Infocomm Research, Agency for Science, Technology and Research, Connexis, Singapore
  • Department of Psychiatry, Lavasani Psychiatric Hospital, Tehran Medical Branch of Azad University, Iran

  • Venue:
  • IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks
  • Year:
  • 2009

Abstract

An affective interface that acquires and detects the emotion of its user can potentially enhance the human-computer interaction experience. In this paper, an affective brain-computer interface (ABCI) is proposed to perform affective computation on electroencephalogram (EEG) correlates of emotion. The proposed ABCI extracts EEG features from subjects exposed to six emotionally related musical and vocal stimuli, using kernel smoothing density estimation (KSDE) and Gaussian mixture model (GMM) probability estimation. A classification algorithm is subsequently used to learn and classify the extracted EEG features. An inter-subject validation study on healthy subjects assesses the performance of the ABCI across a selection of classification algorithms. The results show that the ABCI employing the Bayesian network and the One-Rule classifiers yielded a promising inter-subject validation accuracy of 90%.
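The pipeline described in the abstract — kernel-smoothed density features extracted from EEG epochs, followed by a simple rule-based classifier — can be illustrated with a minimal sketch. This is not the authors' implementation: the bandwidth, grid, synthetic "EEG" data, and the hand-rolled One-Rule classifier are all illustrative assumptions, shown only to make the KSDE-plus-classifier idea concrete.

```python
import numpy as np


def gaussian_kde_features(signal, grid, bandwidth=0.5):
    """Kernel-smoothed density estimate (Gaussian kernel) of one epoch's
    amplitude distribution, evaluated on a fixed grid; the grid values
    serve as a fixed-length feature vector."""
    diffs = (grid[:, None] - signal[None, :]) / bandwidth
    dens = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    return dens / (len(signal) * bandwidth * np.sqrt(2.0 * np.pi))


def one_rule_fit(X, y):
    """One-Rule (OneR) classifier: discretize each feature into quartile
    bins, build a majority-class rule per bin, and keep the single
    feature whose rule has the lowest training error."""
    best = None
    for j in range(X.shape[1]):
        edges = np.quantile(X[:, j], [0.25, 0.5, 0.75])
        bins = np.digitize(X[:, j], edges)
        rule, err = {}, 0
        for b in np.unique(bins):
            labels, counts = np.unique(y[bins == b], return_counts=True)
            rule[b] = labels[np.argmax(counts)]
            err += int((y[bins == b] != rule[b]).sum())
        if best is None or err < best[0]:
            best = (err, j, edges, rule)
    _, j, edges, rule = best
    return j, edges, rule


def one_rule_predict(model, X, default=0):
    j, edges, rule = model
    bins = np.digitize(X[:, j], edges)
    return np.array([rule.get(b, default) for b in bins])


# Synthetic stand-in for EEG epochs under two affective conditions:
# class 1 epochs have higher amplitude variance than class 0.
rng = np.random.default_rng(0)
epochs = np.vstack([rng.normal(0, 1, (40, 256)),   # condition 0
                    rng.normal(0, 3, (40, 256))])  # condition 1
y = np.array([0] * 40 + [1] * 40)

grid = np.linspace(-6, 6, 24)
X = np.array([gaussian_kde_features(e, grid) for e in epochs])

model = one_rule_fit(X, y)
acc = float((one_rule_predict(model, X) == y).mean())
```

Because the two conditions differ in amplitude spread, their density estimates differ most near the center of the grid, so even a single-feature rule separates them well; the paper's actual classifiers (Bayesian network, One-Rule) operate on real EEG-derived features rather than this toy data.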