Enhancement of human computer interaction with facial electromyographic sensors

  • Authors:
  • Guillaume Gibert; Martin Pruzinec; Tanja Schultz; Catherine Stevens

  • Affiliations:
  • University of Western Sydney, Penrith South DC, NSW, Australia; University of Karlsruhe (TH), Karlsruhe, Germany; University of Karlsruhe (TH), Karlsruhe, Germany; University of Western Sydney, Penrith South DC, NSW, Australia

  • Venue:
  • OZCHI '09 Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group: Design: Open 24/7
  • Year:
  • 2009

Abstract

In this paper we describe a way to enhance human computer interaction using facial Electromyographic (EMG) sensors. Knowing the emotional state of the user enables interaction that adapts to his or her mood, so Human Computer Interaction (HCI) gains in ergonomics and ecological validity. While video-based expression recognition systems need exaggerated facial expressions to reach high recognition rates, the technique we developed using electrophysiological data enables faster detection of facial expressions, even in the presence of subtle movements. Features were extracted from 8 EMG sensors located around the face. Gaussian models for six basic facial expressions - anger, surprise, disgust, happiness, sadness and neutral - were learnt from these features and achieve a mean recognition rate of 92%. Finally, a prototype of one possible application of this system was developed, in which the output of the recognizer was sent to the expression module of a 3D avatar that then mimicked the user's expression.
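
The abstract suggests a class-conditional Gaussian pipeline: reduce each window of 8-channel EMG to a feature vector, fit one Gaussian per expression, and classify by maximum likelihood. The sketch below illustrates that idea in Python; the RMS-per-channel features, the covariance regularisation term and all names are assumptions made for illustration, not details taken from the paper.

```python
# Hypothetical sketch of a class-conditional Gaussian classifier for
# facial-expression recognition from EMG features. Feature choice and
# regularisation are assumptions, not the authors' implementation.
import numpy as np
from scipy.stats import multivariate_normal

EXPRESSIONS = ["anger", "surprise", "disgust", "happiness", "sadness", "neutral"]

def extract_features(emg_window):
    """Reduce one window of raw EMG (n_samples x 8 channels) to a feature
    vector; RMS per channel is used here purely for illustration."""
    return np.sqrt(np.mean(np.square(emg_window), axis=0))

class GaussianExpressionRecognizer:
    def fit(self, X, y):
        # Learn one multivariate Gaussian per expression class.
        self.models = {}
        for label in EXPRESSIONS:
            feats = X[y == label]
            self.models[label] = multivariate_normal(
                mean=feats.mean(axis=0),
                # Small diagonal term keeps the covariance invertible.
                cov=np.cov(feats, rowvar=False) + 1e-6 * np.eye(X.shape[1]),
            )
        return self

    def predict(self, x):
        # Choose the class whose Gaussian gives the highest log-likelihood.
        scores = {label: m.logpdf(x) for label, m in self.models.items()}
        return max(scores, key=scores.get)

# Example usage with synthetic data standing in for real EMG recordings.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(600, 8))
y_train = np.array(EXPRESSIONS * 100)
recognizer = GaussianExpressionRecognizer().fit(X_train, y_train)
label = recognizer.predict(extract_features(rng.normal(size=(200, 8))))
```

In a live setting, each predicted label would be forwarded to the avatar's expression module, mirroring the prototype described in the abstract; a per-class Gaussian keeps training cheap and makes real-time classification a simple maximum-likelihood lookup.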