Embedded multimodal nonverbal and verbal interactions between a mobile toy robot and autistic children

  • Authors:
  • Irini Giannopulu

  • Affiliations:
  • UCP & UPMC, Paris, France

  • Venue:
  • Proceedings of the 8th ACM/IEEE international conference on Human-robot interaction
  • Year:
  • 2013


Abstract

We studied the multimodal nonverbal and verbal relationship between autistic children and a mobile toy robot during free, spontaneous game play. A range of cognitive nonverbal criteria, including eye contact, touch, manipulation, and posture, was analyzed, and the frequency of words and verbs was calculated. The embedded multimodal interactions between the autistic children and the mobile toy robot suggest that such a robot could serve as a neural orthosis to improve children's brain activity and encourage them to express language.