The conveyance and recognition of affect and emotion partially determine how people interact with others and how they perform in their day-to-day activities. Endowing technology with the ability to recognize users' affective states is therefore becoming necessary to increase its effectiveness. This paper makes three contributions to this research area. First, we present recognition models that automatically recognize affective states and affective dimensions from non-acted, rather than acted, body postures. The scenario selected for training and testing the models is a body-movement-based video game. Second, when observers attributed affective labels and dimension levels to the postures, presented as faceless avatars, their level of agreement was above chance. Finally, using the labels and dimension levels assigned by the observers as ground truth, and the observers' level of agreement as the base rate, automatic recognition models grounded in low-level posture descriptions were built and tested for their ability to generalize to new observers and postures using repeated random sub-sampling validation. As hypothesized, the models achieved recognition rates comparable to the human base rate.
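The validation scheme the abstract describes can be illustrated with a minimal sketch. The paper's actual recognition models and posture features are not specified here, so this sketch substitutes a toy 1-nearest-neighbour classifier over one-dimensional features purely as a stand-in; only the repeated random sub-sampling procedure itself reflects the method named in the abstract.

```python
import random

def train_1nn(train_x, train_y):
    # "Training" a 1-NN classifier just stores the labelled examples.
    # (Stand-in for the paper's posture-based recognition models.)
    return list(zip(train_x, train_y))

def predict_1nn(model, x):
    # Predict the label of the stored example closest to x.
    return min(model, key=lambda pair: abs(pair[0] - x))[1]

def repeated_random_subsampling(xs, ys, n_splits=50, train_frac=0.8, seed=0):
    """Average test accuracy over n_splits fresh random train/test partitions.

    Unlike k-fold cross-validation, each split draws an independent random
    partition, so the same example may appear in several test sets.
    """
    rng = random.Random(seed)
    n = len(xs)
    accuracies = []
    for _ in range(n_splits):
        idx = list(range(n))
        rng.shuffle(idx)
        cut = int(train_frac * n)
        train, test = idx[:cut], idx[cut:]
        model = train_1nn([xs[i] for i in train], [ys[i] for i in train])
        hits = sum(predict_1nn(model, xs[i]) == ys[i] for i in test)
        accuracies.append(hits / len(test))
    return sum(accuracies) / len(accuracies)
```

In the setting the abstract describes, the mean accuracy returned by such a procedure would then be compared against the human observers' level of agreement, which serves as the base rate.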