Smartphones get emotional: mind reading images and reconstructing the neural sources

  • Authors:
  • Michael Kai Petersen, Carsten Stahlhut, Arkadiusz Stopczynski, Jakob Eg Larsen, Lars Kai Hansen

  • Affiliations:
  • DTU Informatics, Cognitive Systems, Technical University of Denmark, Lyngby, Denmark (all authors)

  • Venue:
  • ACII'11: Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction - Volume Part II
  • Year:
  • 2011

Abstract

Combining a wireless EEG headset with a smartphone offers new opportunities to capture brain imaging data reflecting our everyday social behavior in a mobile context. However, processing the data on a portable device requires novel approaches to analyzing and interpreting significant patterns so that they can be made available for runtime interaction. Applying a Bayesian approach to reconstruct the neural sources, we demonstrate the ability to distinguish among emotional responses reflected in different scalp potentials when viewing pleasant and unpleasant pictures compared to neutral content. Rendering the activations in a 3D brain model on a smartphone may not only facilitate the differentiation of emotional responses but also provide an intuitive interface for touch-based interaction, allowing both for modeling the mental state of users and for building novel bio-feedback applications.
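As a rough illustration of the kind of Bayesian source reconstruction the abstract refers to, the sketch below computes the posterior mean of cortical source amplitudes under a simple Gaussian (minimum-norm style) prior, given scalp potentials and a forward (lead field) matrix. The channel count, source count, lead field, and hyperparameters are illustrative assumptions, not the authors' actual model or data.

```python
import numpy as np

def bayesian_source_estimate(y, A, noise_var=1.0, prior_var=1.0):
    """Posterior mean of source amplitudes s for the linear model
    y = A @ s + noise, with s ~ N(0, prior_var * I) and
    noise ~ N(0, noise_var * I)."""
    n_channels = A.shape[0]
    # Posterior mean: prior_var * A.T @ inv(prior_var * A @ A.T + noise_var * I) @ y
    gram = prior_var * (A @ A.T) + noise_var * np.eye(n_channels)
    return prior_var * A.T @ np.linalg.solve(gram, y)

# Toy example: 14 scalp channels (consumer-headset scale), 500 cortical sources
rng = np.random.default_rng(0)
A = rng.standard_normal((14, 500))        # hypothetical lead field matrix
s_true = np.zeros(500)
s_true[42] = 1.0                          # single active source
y = A @ s_true + 0.1 * rng.standard_normal(14)
s_hat = bayesian_source_estimate(y, A, noise_var=0.01, prior_var=1.0)
```

The estimated source vector `s_hat` could then be mapped onto the vertices of a 3D cortical mesh for on-device visualization; a full pipeline would additionally handle epoching, artifact rejection, and a subject- or template-specific head model.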