First demonstration of a musical emotion BCI

  • Authors:
  • Scott Makeig; Grace Leslie; Tim Mullen; Devpratim Sarma; Nima Bigdely-Shamlo; Christian Kothe

  • Affiliations:
  • Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, La Jolla, California (all authors)

  • Venue:
  • ACII'11: Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction - Volume Part II
  • Year:
  • 2011

Abstract

Development of EEG-based brain-computer interface (BCI) methods has largely focused on creating a communication channel for subjects with intact cognition but profound loss of motor control from stroke or neurodegenerative disease, allowing such subjects to communicate by spelling out words on a personal computer. However, other important human communication channels may also be limited or unavailable to handicapped subjects: direct non-linguistic emotional communication by gesture, vocal prosody, facial expression, etc. We report and examine a first demonstration of a musical 'emotion BCI' in which, as one element of a live musical performance, an able-bodied subject successfully triggered the electronic delivery of an ordered sequence of five two-tone bass-frequency musical drone sounds by imaginatively re-experiencing the feeling he had spontaneously associated with each drone during training sessions. The EEG data included activity from both brain and non-brain sources (scalp muscles, eye movements). Common Spatial Pattern classification gave 84% correct pseudo-online performance and 5-of-5 correct classifications in the live performance. Re-analysis of the training session data using only the brain EEG sources identified by multiple-mixture AMICA ICA decomposition achieved five-class classification accuracy of 59-70%, confirming that different voluntary emotion imagination experiences may be associated with distinguishable brain source EEG dynamics.
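
The classification approach described in the abstract (Common Spatial Pattern features feeding a multiclass decision) can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' code: the synthetic data, epoch dimensions, MNE-Python CSP implementation, and LDA classifier are stand-ins for the paper's actual pipeline, and this sketch omits the AMICA decomposition used in the brain-source re-analysis (AMICA is a separate toolchain not reproduced here).

```python
# Minimal sketch of a CSP + LDA pipeline in the spirit of the paper's
# five-class emotion classification. Everything here is illustrative:
# the random data, epoch shape, and classifier choice are assumptions,
# not details taken from the paper.
import numpy as np
from mne.decoding import CSP
from sklearn.pipeline import make_pipeline
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical epoched EEG: (n_trials, n_channels, n_times) array,
# with one of five emotion-imagination labels per trial.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 32, 512))
y = rng.integers(0, 5, size=100)

# CSP learns spatial filters whose log-variance features separate the
# classes; LDA then makes the five-class decision on those features.
clf = make_pipeline(CSP(n_components=6, log=True),
                    LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)  # chance level is 0.20
print(f"Five-class cross-validated accuracy: {scores.mean():.2f}")
```

On real recordings, X would come from band-pass filtered, epoched EEG rather than random noise, and (following the re-analysis reported above) non-brain sources such as scalp muscle and eye-movement activity could first be removed via ICA before fitting the spatial filters.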