The design of an affect recognition system for socially perceptive robots relies on representative data: human-robot interaction in naturalistic settings requires an affect recognition system trained and validated with contextualised affective expressions, that is, expressions that emerge in the same interaction scenario as the target application. In this paper we propose an initial computational model that automatically analyses human postures and body motion to detect the engagement of children playing chess with an iCat robot acting as a game companion. Our approach relies on the vision-based automatic extraction of expressive postural features from videos that capture the children's behaviour from a lateral view. An initial evaluation, conducted by training several recognition models with contextualised affective postural expressions, suggests that patterns of postural behaviour can be used to accurately predict the children's engagement with the robot, making our approach suitable for integration into an affect recognition system for a game companion in a real-world scenario.
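To make the pipeline concrete, the following is a minimal illustrative sketch of this kind of approach, not the authors' actual system: it derives three simplified postural descriptors (quantity of motion, a contraction index, and a horizontal lean proxy) from a lateral-view video via background subtraction, averages them over fixed-length windows, and feeds them to a classifier. OpenCV and scikit-learn are assumed tooling, and the window length, feature definitions, and classifier choice are illustrative assumptions rather than details taken from the paper.

# Illustrative sketch (not the authors' code): extract simple postural
# features from a lateral-view video and train a classifier to predict
# a per-window engagement label. Feature definitions, window length and
# the classifier are assumptions chosen for illustration.

import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def postural_features(video_path, window=30):
    """Return one feature vector per `window` frames:
    [mean quantity of motion, mean contraction index, mean lean proxy]."""
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    prev, buf, rows = None, [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = cv2.medianBlur(bg.apply(frame), 5)   # crude silhouette
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        body = max(contours, key=cv2.contourArea)   # assume largest blob = child
        x, y, w, h = cv2.boundingRect(body)
        # contraction index: silhouette area relative to its bounding box,
        # i.e. how "closed" or slouched the posture is
        contraction = cv2.contourArea(body) / float(w * h + 1e-6)
        # quantity of motion: fraction of silhouette pixels changed
        # since the previous frame
        qom = 0.0
        if prev is not None:
            qom = np.count_nonzero(cv2.absdiff(mask, prev)) / mask.size
        prev = mask.copy()
        # lean proxy: horizontal offset of the silhouette centroid
        # within its bounding box (forward/backward lean in side view)
        m = cv2.moments(body)
        cx = m["m10"] / (m["m00"] + 1e-6)
        lean = (cx - (x + w / 2.0)) / (w + 1e-6)
        buf.append((qom, contraction, lean))
        if len(buf) == window:
            rows.append(np.mean(buf, axis=0))
            buf = []
    cap.release()
    return np.array(rows)

# Hypothetical usage: X stacks window features from annotated videos and
# y holds per-window engagement labels (e.g. 0 = disengaged, 1 = engaged).
# X, y = ..., ...
# clf = RandomForestClassifier(n_estimators=100, random_state=0)
# print(cross_val_score(clf, X, y, cv=5).mean())

Averaging over short windows mirrors the idea of classifying brief slices of behaviour rather than single frames; any real system would replace the background-subtraction silhouette with a more robust person-segmentation step and validate the features against annotated engagement data.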