Expression glasses: a wearable device for facial expression recognition
CHI '99 Extended Abstracts on Human Factors in Computing Systems
Proceedings of HCI International (the 8th International Conference on Human-Computer Interaction) on Human-Computer Interaction: Ergonomics and User Interfaces, Volume I
ICMI '02 Proceedings of the 4th IEEE International Conference on Multimodal Interfaces
Pupil size variation as an indication of affective processing
International Journal of Human-Computer Studies - Application of affective computing in human-computer interaction
Wearable and automotive systems for affect recognition from physiology
Facial expression recognition from video sequences: temporal and static modeling
Computer Vision and Image Understanding - Special issue on Face recognition
Gazing and frowning as a new human-computer interaction technique
ACM Transactions on Applied Perception (TAP)
EMPATH: A Neural Network that Categorizes Facial Expressions
Journal of Cognitive Neuroscience
Person-independent estimation of emotional experiences from facial expressions
Proceedings of the 10th international conference on Intelligent user interfaces
Recognizing the effects of voluntary facial activations using heart rate patterns
ICCOMP'07 Proceedings of the 11th WSEAS International Conference on Computers
Facial Activation Control Effect (FACE)
ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
How Is It for You? (A Case for Recognising User Motivation in the Design Process)
Affect and Emotion in Human-Computer Interaction
ACM Transactions on Applied Perception (TAP)
Multimodal interfaces: Challenges and perspectives
Journal of Ambient Intelligence and Smart Environments
Haptic interaction becomes reality
Journal of Ambient Intelligence and Smart Environments
Learning patterns in ambient intelligence environments: a survey
Artificial Intelligence Review
Measuring instant emotions during a self-assessment test: the use of FaceReader
Proceedings of the 7th International Conference on Methods and Techniques in Behavioral Research
Identifying emotional states using keystroke dynamics
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Information Processing and Management: an International Journal
Measuring instant emotions based on facial expressions during computer-based assessment
Personal and Ubiquitous Computing
The present aim was to develop methods that estimate emotional experiences in real time from the electromyographic (EMG) activity of two facial muscles: zygomaticus major (activated when smiling) and corrugator supercilii (activated when frowning). Ten subjects were stimulated with a series of emotionally arousing pictures and videos. After each stimulus, the subjects rated the valence of their emotional experience on a nine-point bipolar dimensional scale. At the same time, the computer estimated the subjects' ratings from their electrical facial activity during each stimulation, using 70 computational models. The models estimated the subjects' ratings either categorically or dimensionally with regression models. The best categorical models estimated negative and positive ratings with average accuracies of over 70% for pictures and over 80% for videos. The correlations between the human ratings and the estimates of the best regression models were high (r ≈ 0.9). These findings indicate that models estimating psycho-emotional experiences on the basis of facial activity can be created successfully in several ways.
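The dimensional (regression) approach described in the abstract can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual models: the normalized features, simulated coefficients, and noise level are all assumptions made only to show how a linear regression can map the two EMG channels onto a nine-point valence scale.

```python
import numpy as np

rng = np.random.default_rng(0)

n_stimuli = 40
# Hypothetical normalized mean EMG amplitudes per stimulus.
zygomaticus = rng.uniform(0.0, 1.0, n_stimuli)  # smile-muscle activity
corrugator = rng.uniform(0.0, 1.0, n_stimuli)   # frown-muscle activity

# Simulated ground-truth ratings on a 1-9 bipolar valence scale:
# smiling activity raises valence, frowning activity lowers it.
valence = np.clip(
    5.0 + 4.0 * zygomaticus - 4.0 * corrugator
    + rng.normal(0.0, 0.3, n_stimuli),
    1.0, 9.0,
)

# Fit valence ~ b0 + b1 * zygomaticus + b2 * corrugator by least squares.
X = np.column_stack([np.ones(n_stimuli), zygomaticus, corrugator])
coef, *_ = np.linalg.lstsq(X, valence, rcond=None)

# Correlate the machine estimates with the (simulated) human ratings,
# analogous to the r-values reported for the regression models.
predicted = X @ coef
r = np.corrcoef(valence, predicted)[0, 1]
print(f"correlation between rated and estimated valence: r = {r:.2f}")
```

With the simulated effect sizes above, the fitted coefficients recover a positive weight for zygomaticus and a negative weight for corrugator, and the rating-estimate correlation is high, mirroring the pattern the abstract reports for the best models.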