References
Nudge nudge wink wink: elements of face-to-face conversation for embodied conversational agents. In: Embodied Conversational Agents.
Designing Sociable Robots.
Computer Animation and Virtual Worlds, Special Issue: The Very Best Papers from CASA 2004.
Digital Character Animation 3.
Cross-cultural differences in recognizing affect from body posture. Interacting with Computers.
Computer Animation and Virtual Worlds.
Interpretation of emotional body language displayed by robots. Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments.
Automatic Recognition of Non-Acted Affective Postures. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Adaptive emotional expression in robot-child interaction. Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction.
Previous results show that adults can interpret different key poses displayed by the robot, and that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up increases all three dimensions [1]. Changing the head position should therefore send intuitive signals that can be exploited during an interaction. The ALIZ-E target group is children between the ages of 8 and 11, and existing results suggest that children in this age range are able to interpret human emotional body language [2, 3]. Building on these findings, an experiment was conducted to test whether the results of [1] also hold for children. If so, body postures and head position could be used to convey emotions during an interaction.
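The reported effect of head position can be sketched as a simple mapping: a hedged, illustrative Python fragment, not code from the study. The dimension names (arousal, valence, stance) come from the text above; the numeric scale, the `GAIN` constant, and the function names are assumptions made for illustration only.

```python
# Illustrative sketch (assumed scale and gain, not the authors' implementation):
# shifting a key pose's three expressive dimensions with head pitch.
from dataclasses import dataclass

GAIN = 0.5  # assumed strength of the head-position effect


@dataclass
class Expressivity:
    arousal: float  # level of energy
    valence: float  # positive or negative
    stance: float   # approaching or avoiding


def adjust_for_head(pose: Expressivity, head_pitch: float) -> Expressivity:
    """Shift all three dimensions together with head pitch.

    head_pitch > 0 means head up (all dimensions increase),
    head_pitch < 0 means head down (all dimensions decrease),
    matching the pattern reported for adult observers.
    """
    delta = GAIN * head_pitch
    return Expressivity(
        arousal=pose.arousal + delta,
        valence=pose.valence + delta,
        stance=pose.stance + delta,
    )
```

A controller could apply such a shift to any base key pose, so a single pose library would yield a range of expressive intensities simply by varying head position.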