This paper presents research using full-body skeletal movements, captured with video-based motion-capture technology developed by Vicon Motion Systems, to train a machine to identify different human emotions. The Vicon system uses six cameras to track lightweight markers placed at various points on the body in 3D space, digitizing movement into x, y, and z displacement data. Gestural data were collected from five subjects depicting four emotions: sadness, joy, anger, and fear. Experimental results with different machine learning techniques show automatic classification accuracies ranging from 84% to 92%, depending on how accuracy is calculated. To put these automatic results into perspective, a user study on human perception of the same data was conducted, yielding an average classification accuracy of 93%.
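The pipeline described above (x, y, z marker displacements in, emotion labels out) can be sketched in a few lines. The feature set (per-frame speed statistics), the nearest-centroid classifier, and the synthetic data below are all illustrative assumptions; the paper states only that displacement data and several machine learning techniques were used.

```python
# Illustrative sketch, NOT the paper's actual pipeline: classify synthetic
# motion-capture sequences by summary statistics of marker displacement.
import math
import random
from statistics import mean, stdev

EMOTIONS = ["sadness", "joy", "anger", "fear"]

def features(frames):
    """Summarize a sequence of (x, y, z) marker positions as mean frame-to-frame
    speed and speed variability, a plausible low-level movement descriptor."""
    speeds = [math.dist(a, b) for a, b in zip(frames, frames[1:])]
    return (mean(speeds), stdev(speeds))

def synthetic_sequence(energy, rng, n=50):
    """Generate a 3D random walk whose step size reflects movement energy
    (a stand-in for real Vicon marker trajectories)."""
    pos = (0.0, 0.0, 0.0)
    frames = [pos]
    for _ in range(n):
        pos = tuple(c + rng.gauss(0, energy) for c in pos)
        frames.append(pos)
    return frames

def nearest_centroid(train, query):
    """Classify by Euclidean distance to each class's mean feature vector."""
    centroids = {}
    for label in EMOTIONS:
        feats = [f for lbl, f in train if lbl == label]
        centroids[label] = tuple(mean(col) for col in zip(*feats))
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], query))

rng = random.Random(0)
# Hypothetical mapping of emotion to movement energy, for illustration only.
ENERGY = {"sadness": 0.2, "joy": 1.0, "anger": 1.6, "fear": 0.6}
train = [(e, features(synthetic_sequence(ENERGY[e], rng)))
         for e in EMOTIONS for _ in range(20)]
probe = features(synthetic_sequence(ENERGY["anger"], rng))
print(nearest_centroid(train, probe))
```

On this toy data the classes separate cleanly by movement energy; real gestural data would of course need richer features (per-joint trajectories, temporal dynamics) to reach the 84-92% accuracies reported.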