Using proprioceptive sensors for categorizing human-robot interactions

  • Authors:
  • T. Salter, F. Michaud, D. Létourneau, D. C. Lee, and I. P. Werry

  • Affiliations:
  • Université de Sherbrooke, Sherbrooke, Québec, Canada (T. Salter, F. Michaud, D. Létourneau); University of Hertfordshire, Hertfordshire, England (D. C. Lee, I. P. Werry)

  • Venue:
  • Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction
  • Year:
  • 2007

Abstract

Increasingly, researchers are looking beyond conventional communication channels (such as video and audio) to provide additional forms of communication or interaction between a human and a robot, or between a robot and its environment. Among the new channels being investigated is the detection of touch using infrared, proprioceptive, and temperature sensors. Our work aims to develop a system that can detect natural touch or interaction from children playing with a robot, and adapt to this interaction. This paper reports trials carried out using Roball, a spherical mobile robot, demonstrating how sensory data patterns can be identified in human-robot interaction and exploited to achieve behavioral adaptation. The experimental methodology used for these trials is reported, which validated the hypothesis that human interaction can not only be perceived from proprioceptive sensors on board a robotic platform, but that this perception can also drive adaptation.
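As a loose illustration of the kind of pattern identification the abstract describes, the sketch below classifies a window of proprioceptive (accelerometer) magnitude readings into interaction categories. This is a hypothetical simplification, not the paper's actual method: the function name, category labels, and thresholds are all illustrative assumptions.

```python
def classify_interaction(accel_samples, gentle_max=0.5, rough_min=2.0):
    """Label a window of accelerometer magnitudes (in g).

    Hypothetical sketch: thresholds and labels are illustrative
    assumptions, not values from the reported Roball trials.
    """
    if not accel_samples:
        return "none"          # no data in the window
    peak = max(abs(a) for a in accel_samples)
    if peak < gentle_max:
        return "idle"          # robot left alone or barely touched
    if peak < rough_min:
        return "gentle"        # moderate handling, e.g. careful play
    return "rough"             # vigorous play; could trigger adaptation
```

A robot could, in this spirit, map each category to a behavioral change (for example, slowing down when play is classified as rough), which is the kind of perception-to-adaptation loop the trials set out to validate.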