FG '00 Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition 2000
A real-time head nod and shake detector. Proceedings of the 2001 Workshop on Perceptive User Interfaces
Contextual recognition of head gestures. ICMI '05: Proceedings of the 7th International Conference on Multimodal Interfaces
ACII '07: Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction
A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence
Affective State Estimation for Human–Robot Interaction. IEEE Transactions on Robotics
A multitask approach to continuous five-dimensional affect sensing in natural speech. ACM Transactions on Interactive Intelligent Systems (TiiS), Special Issue on Affective Interaction in Natural Environments
Output-associative RVM regression for dimensional and continuous emotion prediction. Image and Vision Computing
Robust continuous prediction of human emotions using multiscale dynamic cues. Proceedings of the 14th ACM International Conference on Multimodal Interaction
Image and Vision Computing
This paper focuses on dimensional prediction of emotions from spontaneous conversational head gestures, a topic on which there has been virtually no prior research. It maps the amount and direction of head motion, together with occurrences of head nods and shakes, onto the arousal, expectation, intensity, power, and valence levels of the observed subject. Preliminary experiments show that it is possible to automatically predict emotions along these five dimensions from conversational head gestures. Dimensional and continuous emotion prediction from spontaneous head gestures has been integrated into the SEMAINE project [1], which aims to achieve sustained, emotionally colored interaction between a human user and Sensitive Artificial Listeners.
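To make the mapping concrete, the sketch below shows one minimal way such a pipeline could be framed: per-window head-gesture statistics (motion amount, motion direction, nod and shake rates) regressed onto the five affect dimensions. This is an illustrative assumption, not the paper's actual model; the feature set, training values, and the nearest-neighbour regressor are all invented for the example.

```python
import math

# Hypothetical sketch (not the paper's method): map per-window head-gesture
# statistics to five continuous affect dimensions with a 1-nearest-neighbour
# regressor. All feature names and numbers below are made up for illustration.

DIMENSIONS = ("arousal", "expectation", "intensity", "power", "valence")

# Each training item: (features, targets). Features are
# (head motion amount, signed horizontal direction, nods/sec, shakes/sec);
# targets are the five dimension levels, each in [-1, 1].
TRAIN = [
    ((0.9, 0.1, 1.2, 0.0), (0.7, 0.2, 0.8, 0.5, 0.6)),    # vigorous nodding
    ((0.8, -0.2, 0.0, 1.1), (0.6, 0.4, 0.7, 0.4, -0.5)),  # vigorous shaking
    ((0.1, 0.0, 0.0, 0.0), (-0.4, 0.0, 0.1, 0.0, 0.0)),   # near-still head
]

def predict(features):
    """Return a dict mapping each affect dimension to a predicted level,
    taken from the training example closest in Euclidean distance."""
    _, targets = min(TRAIN, key=lambda item: math.dist(item[0], features))
    return dict(zip(DIMENSIONS, targets))
```

In a real system the lookup table would be replaced by a regressor trained on annotated recordings (the paper's experiments use spontaneous conversational data), but the input/output shape of the problem is the same: a short feature vector in, five continuous dimension values out.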