- Coding, Analysis, Interpretation, and Recognition of Facial Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence.
- Real-time estimation of emotional experiences from facial expressions. Interacting with Computers.
- Emotions and heart rate while sitting on a chair. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
- Combining eye movements and collaborative filtering for proactive information retrieval. Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
- Measuring emotional valence during interactive experiences: boys at video game play. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
- International Journal of Human-Computer Studies.
- Recognizing the effects of voluntary facial activations using heart rate patterns. ICCOMP'07 Proceedings of the 11th WSEAS International Conference on Computers.
- Facial Activation Control Effect (FACE). ACII '07 Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction.
The aim of this research was to develop methods for the automatic, person-independent estimation of experienced emotions from facial expressions. Ten subjects watched a series of emotionally arousing pictures and videos while the electromyographic (EMG) activity of two facial muscles, the zygomaticus major (activated in smiling) and the corrugator supercilii (activated in frowning), was recorded. Based on the changes in the activity of these two muscles, ratings of positive and negative emotional experiences could be distinguished at a rate of almost 70% for pictures and over 80% for videos. Using these methods, a computer could adapt its behavior to the user's emotions during human-computer interaction.
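The core idea of the described method can be illustrated with a minimal sketch: compare each EMG channel's stimulus-period activity against its pre-stimulus baseline, and label the experience positive when the smiling muscle shows the larger increase. This is not the authors' actual classifier; the function name, the baseline scheme, and the simple comparison rule are all assumptions for illustration.

```python
def classify_valence(zygomaticus, corrugator, n_baseline):
    """Hypothetical valence classifier from two rectified EMG channels.

    zygomaticus, corrugator: sequences of EMG amplitudes for the
    zygomaticus major (smiling) and corrugator supercilii (frowning);
    the first n_baseline samples of each are a pre-stimulus baseline.
    Returns "positive" or "negative".
    """
    def change(signal):
        # Mean activity during the stimulus minus mean baseline activity.
        baseline = sum(signal[:n_baseline]) / n_baseline
        stimulus = sum(signal[n_baseline:]) / (len(signal) - n_baseline)
        return stimulus - baseline

    # Assumed decision rule: a stronger increase in the smiling muscle
    # than in the frowning muscle indicates a positive experience.
    return "positive" if change(zygomaticus) > change(corrugator) else "negative"
```

For example, a recording where the zygomaticus amplitude rises during the stimulus while the corrugator stays flat would be labeled positive, and the reverse pattern negative.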