Children interpretation of emotional body language displayed by a robot

  • Authors:
  • Aryel Beck, Lola Cañamero, Luisa Damiano, Giacomo Sommavilla, Fabio Tesser, Piero Cosi

  • Affiliations:
  • Adaptive Systems Research Group, School of Computer Science & STRI, University of Hertfordshire, United Kingdom (Beck, Cañamero, Damiano); Institute of Cognitive Sciences and Technologies, Padova, Italy (Sommavilla, Tesser, Cosi)

  • Venue:
  • ICSR'11: Proceedings of the Third International Conference on Social Robotics
  • Year:
  • 2011


Abstract

Previous results show that adults are able to interpret different key poses displayed by a robot, and that changing the head position affects the expressiveness of the key poses in a consistent way: moving the head down decreases arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up increases all three dimensions [1]. Changing the head position should therefore send intuitive signals that can be exploited during an interaction. The ALIZ-E target group is children between the ages of 8 and 11. Existing results suggest that children of this age are able to interpret human emotional body language [2, 3]. Based on these results, an experiment was conducted to test whether the findings of [1] also apply to children. If so, body postures and head positions could be used to convey emotions during an interaction.
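
The head-position effect reported in [1] can be summarized as a uniform shift across the three affect dimensions. The following is a minimal sketch of that relationship; the class, function, and the magnitude `delta` are hypothetical illustrations, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Affect:
    arousal: float  # level of energy
    valence: float  # positive or negative
    stance: float   # approaching or avoiding

def apply_head_position(base: Affect, head_up: bool, delta: float = 0.2) -> Affect:
    """Shift all three dimensions up when the head moves up, down when it moves down.

    The shared shift reflects the finding in [1] that head position affects
    arousal, valence, and stance in the same direction; delta is an assumed value.
    """
    sign = 1.0 if head_up else -1.0
    return Affect(
        arousal=base.arousal + sign * delta,
        valence=base.valence + sign * delta,
        stance=base.stance + sign * delta,
    )

if __name__ == "__main__":
    neutral = Affect(arousal=0.0, valence=0.0, stance=0.0)
    print(apply_head_position(neutral, head_up=True))   # all three dimensions increase
    print(apply_head_position(neutral, head_up=False))  # all three dimensions decrease
```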