In this paper, we describe a way to enhance human-computer interaction using facial electromyographic (EMG) sensors. Knowing the user's emotional state enables interaction adapted to the user's mood, so Human-Computer Interaction (HCI) gains in ergonomics and ecological validity. While video-based expression recognition systems require exaggerated facial expressions to reach high recognition rates, the technique we developed using electrophysiological data enables faster detection of facial expressions, even in the presence of subtle movements. Features were extracted from 8 EMG sensors located around the face. Gaussian models for six basic facial expressions (anger, surprise, disgust, happiness, sadness, and neutral) were learned from these features and yield a mean recognition rate of 92%. Finally, we developed a prototype of one possible application of this system, in which the output of the recognizer was sent to the expression module of a 3D avatar that then mimicked the detected expression.
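The abstract does not give implementation details for the Gaussian models, so the following is only a minimal sketch of the classification step under some assumptions: one feature vector per analysis window derived from the 8 EMG channels, one full-covariance multivariate Gaussian fitted per expression class by maximum likelihood, and classification by maximum log-density with equal class priors. The class and method names are hypothetical, not from the paper.

```python
import numpy as np


class GaussianExpressionClassifier:
    """Per-class multivariate Gaussian classifier over EMG feature vectors.

    Sketch only: the paper's exact features and model form are not
    specified. Each expression class is modeled by one full-covariance
    Gaussian; prediction picks the class with the highest log-density
    (equal priors assumed).
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            # A small ridge keeps the covariance invertible when a class
            # has few training windows relative to the feature dimension.
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.params_[c] = (mu, np.linalg.inv(cov),
                               np.linalg.slogdet(cov)[1])
        return self

    def _log_density(self, x, c):
        # Log of the Gaussian density up to a constant shared by all classes.
        mu, inv_cov, logdet = self.params_[c]
        d = x - mu
        return -0.5 * (logdet + d @ inv_cov @ d)

    def predict(self, X):
        return np.array([
            max(self.classes_, key=lambda c: self._log_density(x, c))
            for x in X
        ])
```

In a real pipeline, `X` would hold per-window EMG features (e.g., amplitude or spectral measures per channel), and the predicted label would drive the avatar's expression module as in the prototype described above.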