Complex, natural social interaction between artificial agents (computer-generated or robotic) and humans requires the display of rich emotions if agents are to be believable, socially relevant, and accepted, and to elicit the natural emotional responses, such as engagement or empathy, that humans show during social interaction. Whereas some robots use faces to display (simplified) emotional expressions, for robots such as Nao, which cannot convey facial expressions, body language is the best medium available. Displaying emotional body language that users can interpret while interacting with the robot should significantly improve the naturalness of the interaction. This research investigates the creation of an affect space for generating emotional body language to be displayed by humanoid robots. Three experiments were conducted to investigate how emotional body language displayed by agents is interpreted. The first compared the interpretation of emotional body language displayed by humans and by agents; the results showed that body language displayed by an agent or a human is interpreted similarly in terms of recognition. Following these results, emotional key poses were extracted from an actor's performances and implemented on a Nao robot. The interpretation of these key poses was validated in a second study, in which participants interpreted the displayed key poses at better-than-chance levels. Finally, an affect space was generated by blending the key poses and validated in a third study. Overall, these experiments confirm that body language is an appropriate medium for robots to display emotions and suggest that an affect space for body expressions can be used to improve the expressiveness of humanoid robots.
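As a rough illustration of the blending step described above, one common way to build an affect space from key poses is to linearly interpolate joint angles between them. The sketch below is not the authors' implementation: the function, the pose names, and the angle values are all hypothetical (the joint names loosely follow Nao's naming scheme, but the numbers are made up for the example).

```python
def blend_poses(pose_a, pose_b, alpha):
    """Linearly interpolate between two key poses.

    pose_a, pose_b: dicts mapping joint names to angles (radians).
    alpha: blend weight in [0, 1]; 0 returns pose_a, 1 returns pose_b.
    """
    if pose_a.keys() != pose_b.keys():
        raise ValueError("poses must define the same joints")
    return {j: (1 - alpha) * pose_a[j] + alpha * pose_b[j] for j in pose_a}


# Illustrative key poses; angle values are invented for this example.
ANGER = {"HeadPitch": -0.3, "LShoulderPitch": 0.5, "RShoulderPitch": 0.5}
SADNESS = {"HeadPitch": 0.4, "LShoulderPitch": 1.4, "RShoulderPitch": 1.4}

# A point halfway between the two key poses in the affect space.
halfway = blend_poses(ANGER, SADNESS, 0.5)
```

Sweeping `alpha` continuously between several key poses of this kind yields intermediate expressions, which is one plausible reading of how an affect space can cover more of the expressive range than the discrete key poses alone.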