Automatic analysis of affective postures and body motion to detect engagement with a game companion

  • Authors:
  • Jyotirmay Sanghvi (Queen Mary University of London, London, United Kingdom)
  • Ginevra Castellano (Queen Mary University of London, London, United Kingdom)
  • Iolanda Leite (Instituto Superior Técnico, Porto Salvo, Portugal)
  • André Pereira (Instituto Superior Técnico, Porto Salvo, Portugal)
  • Peter W. McOwan (Queen Mary University of London, London, United Kingdom)
  • Ana Paiva (Instituto Superior Técnico, Porto Salvo, Portugal)

  • Venue:
  • Proceedings of the 6th International Conference on Human-Robot Interaction
  • Year:
  • 2011

Abstract

The design of an affect recognition system for socially perceptive robots relies on representative data: human-robot interaction in naturalistic settings requires an affect recognition system trained and validated with contextualised affective expressions, that is, expressions that emerge in the same interaction scenario as the target application. In this paper we propose an initial computational model that automatically analyses human postures and body motion to detect the engagement of children playing chess with an iCat robot acting as a game companion. Our approach is based on the vision-based automatic extraction of expressive postural features from videos capturing the children's behaviour from a lateral view. An initial evaluation, conducted by training several recognition models with contextualised affective postural expressions, suggests that patterns of postural behaviour can be used to accurately predict the children's engagement with the robot, making our approach suitable for integration into an affect recognition system for a game companion in a real-world scenario.
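To make the pipeline concrete, the following is a minimal sketch (not the authors' implementation) of the kind of processing the abstract describes: a "quantity of motion" feature computed by frame differencing on grey-scale frames, followed by a toy nearest-centroid rule that maps the feature to an engagement label. The frames, the threshold, and the centroid values are all illustrative assumptions; the paper's actual feature set, training data, and recognition models are described in the full text.

```python
# Illustrative sketch only -- not the paper's implementation.
# Feature: fraction of pixels whose intensity changed between two frames
# (a crude "quantity of motion"); classifier: nearest centroid over that
# feature, with hypothetical centroids standing in for trained models.

def quantity_of_motion(prev_frame, frame, threshold=10):
    """Fraction of pixels whose intensity changed by more than `threshold`."""
    changed = sum(
        1
        for prev_row, row in zip(prev_frame, frame)
        for p, q in zip(prev_row, row)
        if abs(p - q) > threshold
    )
    total = len(frame) * len(frame[0])
    return changed / total

def nearest_centroid(feature, centroids):
    """Return the label whose centroid is closest to the feature value."""
    return min(centroids, key=lambda label: abs(centroids[label] - feature))

# Synthetic 4x4 grey-scale frames (toy data): a child who moves between
# frames yields a higher quantity of motion than one who sits still.
still = [[100] * 4 for _ in range(4)]
moving = [[100, 160, 100, 160] for _ in range(4)]

qom_active = quantity_of_motion(still, moving)  # 0.5: half the pixels changed
qom_idle = quantity_of_motion(still, still)     # 0.0: no change

# Hypothetical centroids that a training phase might have produced.
centroids = {"engaged": 0.4, "disengaged": 0.05}

print(nearest_centroid(qom_active, centroids))  # engaged
print(nearest_centroid(qom_idle, centroids))    # disengaged
```

In a real system the frames would come from the lateral-view video stream, several postural features would be extracted per time window, and the classifier would be one of the trained recognition models the paper evaluates rather than a hand-set centroid rule.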