Hand-over-face gestures, a subset of emotional body language, are overlooked by automatic affect inference systems. We propose using hand-over-face gestures as a novel cue for the automatic inference of cognitive mental states. Moreover, affect recognition systems rely on publicly available datasets; an approach is often only as good as the data it is trained on. We present the collection and annotation methodology for a 3D multimodal corpus of 108 audio/video segments of natural complex mental states. The corpus includes spontaneous facial expressions and hand gestures, labelled using crowd-sourcing, and is publicly available.