This paper presents a system for inferring complex mental states from video of facial expressions and head gestures in real time. The system is based on a multi-level dynamic Bayesian network classifier that models complex mental states as a number of interacting facial and head displays, identified from component-based facial features. Experimental results are reported for six mental-state groups: agreement, concentrating, disagreement, interested, thinking, and unsure. Real-time performance, unobtrusiveness, and the lack of preprocessing make our system particularly suitable for user-independent human-computer interaction.
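The layered idea in the abstract (facial features feed head/facial displays, which in turn feed mental-state inference) can be illustrated with a minimal sketch. This is a static, naive-Bayes simplification, not the paper's dynamic Bayesian network: it ignores temporal dynamics, and all display names and likelihood values below are hypothetical placeholders, not figures from the paper.

```python
# Minimal sketch: observed head/facial displays are treated as evidence,
# and a discrete Bayes update scores the six mental-state groups.
# Likelihood values are illustrative assumptions only.

MENTAL_STATES = ["agreement", "concentrating", "disagreement",
                 "interested", "thinking", "unsure"]

# P(display observed | mental state) -- hypothetical values.
LIKELIHOOD = {
    "head_nod":   {"agreement": 0.8, "concentrating": 0.2, "disagreement": 0.1,
                   "interested": 0.4, "thinking": 0.2, "unsure": 0.1},
    "head_shake": {"agreement": 0.05, "concentrating": 0.1, "disagreement": 0.8,
                   "interested": 0.1, "thinking": 0.2, "unsure": 0.3},
    "brow_raise": {"agreement": 0.3, "concentrating": 0.1, "disagreement": 0.2,
                   "interested": 0.7, "thinking": 0.3, "unsure": 0.4},
}

def infer(displays, prior=None):
    """Return a normalized posterior over mental states given observed displays."""
    post = dict(prior) if prior else {s: 1.0 / len(MENTAL_STATES)
                                      for s in MENTAL_STATES}
    for d in displays:
        for s in MENTAL_STATES:
            post[s] *= LIKELIHOOD[d][s]
    total = sum(post.values())
    return {s: p / total for s, p in post.items()}

posterior = infer(["head_nod", "brow_raise"])
best = max(posterior, key=posterior.get)
```

In the actual system, the dynamic Bayesian network would additionally carry the posterior forward in time, so that evidence accumulated over a video sequence, rather than a single frame, drives the mental-state inference.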