Real-time social touch gesture recognition for sensate robots

  • Authors:
  • Heather Knight; Robert Toscano; Walter D. Stiehl; Angela Chang; Yi Wang; Cynthia Breazeal

  • Affiliations:
  • MIT Media Lab, Cambridge, MA (all authors)

  • Venue:
  • IROS '09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009


Abstract

This paper describes the hardware and algorithms for a real-time social touch gesture recognition system. Early experiments involve a Sensate Bear test rig with full-body touch sensing, sensor visualization, and gesture recognition capabilities. The algorithms are based on observations of real humans interacting with a plush bear. In developing a preliminary gesture library with thirteen Symbolic Gestures and eight Touch Subtypes, we have taken the first steps toward a Robotic Touch API, showing that the Huggable robot behavior system will be able to stream currently active sensors to detect regional social gestures and local sub-gestures in real time. The system demonstrates the infrastructure to detect three types of touch: social touch, local touch, and sensor-level touch.
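
The abstract does not include implementation details, so the following Python sketch only illustrates how the three levels of touch detection it names (sensor-level, local, and social) might be layered in a streaming pipeline. All function names, region mappings, thresholds, and gesture labels here are assumptions for illustration, not the authors' actual Robotic Touch API.

```python
# Hypothetical three-level touch pipeline in the spirit of the abstract.
# Sensor ids, regions, thresholds, and gesture labels are illustrative only.
from collections import deque
from typing import Dict, List

ACTIVATION_THRESHOLD = 0.3          # assumed normalized sensor reading cutoff
REGIONS = {                          # assumed mapping of sensor ids to body regions
    "head": [0, 1, 2],
    "belly": [3, 4, 5, 6],
    "left_paw": [7, 8],
    "right_paw": [9, 10],
}

def active_sensors(frame: Dict[int, float]) -> List[int]:
    """Sensor-level touch: individual sensors whose reading exceeds threshold."""
    return [sid for sid, value in frame.items() if value >= ACTIVATION_THRESHOLD]

def active_regions(active: List[int]) -> List[str]:
    """Local touch: group currently active sensors into body regions."""
    return [name for name, ids in REGIONS.items() if any(s in ids for s in active)]

def classify_gesture(history: deque) -> str:
    """Social touch: crude temporal rule over recent region activity.

    A real system would draw on the paper's learned gesture library; this
    placeholder only separates a sustained belly contact ("hug"-like) from
    a brief contact ("poke"-like).
    """
    frames_with_belly = sum(1 for regions in history if "belly" in regions)
    if frames_with_belly > len(history) * 0.8:
        return "hug (assumed label)"
    if any(regions for regions in history):
        return "poke (assumed label)"
    return "no_gesture"

# Usage: feed a sliding window of sensor frames through the three levels.
window: deque = deque(maxlen=30)              # ~1 s of frames at an assumed 30 Hz
frame = {0: 0.1, 3: 0.9, 4: 0.8, 7: 0.05}     # one synthetic frame of readings
active = active_sensors(frame)                # sensor-level touch
regions = active_regions(active)              # local touch
window.append(regions)
print(active, regions, classify_gesture(window))  # social touch
```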