Conversational gestures in human-robot interaction

  • Authors:
  • Paul Bremner; Anthony Pipe; Chris Melhuish; Mike Fraser; Sriram Subramanian

  • Affiliations:
  • Bristol Robotics Laboratory, Bristol, UK (Bremner, Pipe, Melhuish); Computer Science Dept., University of Bristol, Bristol, UK (Fraser, Subramanian)

  • Venue:
  • SMC'09: Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics
  • Year:
  • 2009


Abstract

The human sciences have demonstrated that gesture is a critical element of human communication. While existing graphical solutions are appropriate for virtual agents, producing arm trajectories for physically embodied robots requires that the challenges of robot dynamics be addressed within a real-time gesture framework. We explore and evaluate a gesture production algorithm with low computational cost that generates adequate gesture trajectories on a humanoid torso, as judged by participants in the human-robot gesturing studies presented in this paper. Our approach produces a constrained inverse-kinematic solution for the gesture start and end points and generates appropriate wrist angles; the gesture time is then used to calculate the joint accelerations that give a smooth, direct hand movement. Taking open-hand gestures as an example gesture sub-domain, we implement our controller on BERTI, a bespoke upper-torso humanoid robot (Fig. 2). A qualitative pilot study highlights the gesture features salient to users: gesture shape, timing, naturalness and smoothness. A controlled experimental study then demonstrates that, by these metrics, our algorithm performs well, despite some dissimilarities with users' own gestures. We draw out several salient points about robot gesture from these studies.
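
The trajectory step described in the abstract (joint accelerations computed from the gesture time so the hand moves smoothly and directly between the start and end poses) can be illustrated with a short sketch. The Python below is an assumed, minimal interpretation: the symmetric accelerate/decelerate profile, the function name `joint_trajectory`, and the numeric example are ours for illustration, not the authors' published controller.

    import numpy as np

    def joint_trajectory(theta_start, theta_end, gesture_time, dt=0.01):
        """Illustrative point-to-point joint trajectory (assumed profile).

        Each joint accelerates at a constant rate for the first half of the
        gesture and decelerates for the second half, so the motion starts
        and ends at rest and covers the IK start-to-end displacement in
        exactly `gesture_time` seconds.
        """
        theta_start = np.asarray(theta_start, dtype=float)
        theta_end = np.asarray(theta_end, dtype=float)
        delta = theta_end - theta_start

        # Acceleration needed for a symmetric accelerate/decelerate profile:
        # a = 4 * delta / T^2 (derived from delta/2 = 0.5 * a * (T/2)^2).
        accel = 4.0 * delta / gesture_time**2

        times = np.arange(0.0, gesture_time + dt, dt)
        trajectory = np.empty((len(times), len(delta)))
        for i, t in enumerate(times):
            if t <= gesture_time / 2.0:
                # First half: accelerate from rest.
                trajectory[i] = theta_start + 0.5 * accel * t**2
            else:
                # Second half: mirror image, decelerating to rest at theta_end.
                tr = gesture_time - t
                trajectory[i] = theta_end - 0.5 * accel * tr**2
        return times, trajectory

    # Example (hypothetical values): move two joints from rest at 0 rad to
    # [0.8, -0.4] rad over a 1.2 s gesture.
    t, q = joint_trajectory([0.0, 0.0], [0.8, -0.4], gesture_time=1.2)

Under this reading, the gesture duration alone fixes the acceleration for every joint, which keeps the per-timestep computation trivial and the hand path direct between the two IK solutions.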