Correlation between stimulated emotion extracted from EEG and its manifestation on facial expression

  • Authors:
  • A. Chakraborty;P. Bhowmik;S. Das;A. Halder;A. K. Nagar

  • Affiliations:
  • Dept. of Computer Science and Engineering, St. Thomas' College of Engineering and Technology, Calcutta;Dept. of Electronics and Tele-Communication Engg., Jadavpur University, Calcutta;Dept. of Electronics and Tele-Communication Engg., Jadavpur University, Calcutta;Dept. of Electronics and Tele-Communication Engg., Jadavpur University, Calcutta;Dept. of Computer Science, Liverpool Hope University, Liverpool, UK

  • Venue:
  • SMC'09 Proceedings of the 2009 IEEE international conference on Systems, Man and Cybernetics
  • Year:
  • 2009

Abstract

Determining the correlation between an aroused emotion and its manifestation in facial expression, voice, gesture, and posture has interesting applications in psychotherapy. A set of audio-visual stimuli, selected by a group of experts, is used to excite emotions in the subjects. The EEG and facial expressions of the subjects excited by the selected audio-visual stimuli are collected, and the nonlinear correlation from EEG to facial expression, and vice versa, is obtained by employing a feed-forward neural network trained with the back-propagation algorithm. The experiments undertaken reveal that the trained network can reproduce the correlated EEG-facial-expression training instances with 100% accuracy, and is also able to predict facial expression (EEG) from unknown EEG (facial expression) of the same subject with an accuracy of around 95.2%.
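The mapping described in the abstract can be sketched as a single-hidden-layer feed-forward network trained by back-propagation to regress one feature vector onto another. The sketch below is a minimal illustration under assumed conditions, not the authors' implementation: the feature dimensions, network size, learning rate, and the toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FeedForwardNet:
    """One-hidden-layer network trained by plain back-propagation
    (stochastic gradient descent on squared error)."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        # Small random weights; sizes here are illustrative assumptions.
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(x @ self.W1 + self.b1)   # hidden activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)  # output activations
        return self.y

    def backprop(self, x, target):
        y = self.forward(x)
        # Deltas for squared error with sigmoid units.
        d2 = (y - target) * y * (1.0 - y)
        d1 = (d2 @ self.W2.T) * self.h * (1.0 - self.h)
        # Gradient-descent updates.
        self.W2 -= self.lr * np.outer(self.h, d2)
        self.b2 -= self.lr * d2
        self.W1 -= self.lr * np.outer(x, d1)
        self.b1 -= self.lr * d1

def mse(net, X, T):
    return float(np.mean([(net.forward(x) - t) ** 2 for x, t in zip(X, T)]))

# Toy stand-ins: 8-dim "EEG features" -> 4-dim "facial-expression
# features" (synthetic random data, purely for demonstration).
X = rng.random((20, 8))
T = rng.random((20, 4))

net = FeedForwardNet(n_in=8, n_hidden=6, n_out=4)
err_before = mse(net, X, T)
for epoch in range(2000):
    for x, t in zip(X, T):
        net.backprop(x, t)
err_after = mse(net, X, T)
```

In the paper's setting the same architecture would be trained twice, once with EEG features as input and facial-expression features as target, and once in the reverse direction, giving the two-way prediction the abstract reports.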