Computing emotion awareness through facial electromyography

  • Authors:
  • Egon L. van den Broek, Marleen H. Schut, Joyce H. D. M. Westerink, Jan van Herk, Kees Tuinenbreijer

  • Affiliations:
  • Center for Telematics and Information Technology (CTIT) / Institute for Behavioral Research (IBR), University of Twente, Enschede, The Netherlands
  • Department of Artificial Intelligence / Nijmegen Institute for Cognition and Information (NICI), Radboud University Nijmegen, Nijmegen, The Netherlands
  • Philips Research, Eindhoven, The Netherlands
  • Philips Consumer Electronics, The Innovation Laboratories, Eindhoven, The Netherlands

  • Venue:
  • ECCV'06 Proceedings of the 2006 international conference on Computer Vision in Human-Computer Interaction
  • Year:
  • 2006

Abstract

To improve human-computer interaction (HCI), computers need to recognize and respond properly to their user's emotional state. This is a fundamental application of affective computing, which relates to, arises from, or deliberately influences emotion. As a first step toward a system that recognizes the emotions of individual users, this research focuses on how emotional experiences are expressed in six parameters (i.e., mean, absolute deviation, standard deviation, variance, skewness, and kurtosis) of physiological measurements of three electromyography signals: frontalis (EMG1), corrugator supercilii (EMG2), and zygomaticus major (EMG3). The 24 participants were asked to watch film scenes of 120 seconds, which they rated afterward. These ratings enabled us to distinguish four categories of emotions: negative, positive, mixed, and neutral. The skewness of EMG2 and four parameters of EMG3 discriminate between the four emotion categories, despite the coarse time windows that were used. Moreover, rapid processing of the signals proved to be possible. This enables tailored HCI, facilitated by systems' emotional awareness.
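The six parameters named in the abstract are standard descriptive statistics of a signal window. A minimal sketch of how they could be computed over one EMG time window is shown below; the function name and the use of NumPy are assumptions for illustration, not the authors' implementation, and the kurtosis here is the non-excess (fourth standardized moment) variant.

```python
import numpy as np

def emg_window_features(signal):
    """Compute the six descriptive statistics (mean, absolute deviation,
    standard deviation, variance, skewness, kurtosis) of one EMG window.
    Illustrative sketch only; not the paper's original code."""
    x = np.asarray(signal, dtype=float)
    mean = x.mean()
    abs_dev = np.mean(np.abs(x - mean))   # mean absolute deviation
    var = x.var()                         # population variance
    std = np.sqrt(var)
    z = (x - mean) / std                  # standardized samples
    skewness = np.mean(z ** 3)            # third standardized moment
    kurtosis = np.mean(z ** 4)            # fourth standardized moment (non-excess)
    return {
        "mean": mean,
        "abs_dev": abs_dev,
        "std": std,
        "variance": var,
        "skewness": skewness,
        "kurtosis": kurtosis,
    }
```

Because each feature is a single pass (or two) over the window, a 120-second window at typical EMG sampling rates can be summarized in well under a millisecond, consistent with the abstract's claim that rapid processing of the signals is possible.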