3D-tracking of head and hands for pointing gesture recognition in a human-robot interaction scenario

  • Authors:
  • Kai Nickel; Edgar Seemann; Rainer Stiefelhagen

  • Affiliations:
  • Interactive Systems Labs, Universität Karlsruhe, Germany (all authors)

  • Venue:
  • FGR '04: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition
  • Year:
  • 2004

Abstract

In this paper, we present our approach for visual tracking of head, hands and head orientation. Given the images provided by a calibrated stereo camera, color and disparity information are integrated into a multi-hypothesis tracking framework in order to find the 3D positions of the respective body parts. Based on the hands' motion, an HMM-based approach is applied to recognize pointing gestures. We show experimentally that the gesture recognition performance can be improved significantly by using visually gained information about head orientation as an additional feature. Our system aims at applications in the field of human-robot interaction, where it is important to do run-on recognition in real-time, to allow for the robot's egomotion, and not to rely on manual initialization.
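To illustrate the HMM-based recognition step mentioned in the abstract, here is a minimal, hypothetical sketch: a discrete-observation HMM scored with the forward algorithm, with a sequence classified by comparing log-likelihoods under a "pointing" model versus an "idle" model. The states, emission symbols, and all probability values below are toy assumptions for illustration; the paper's actual features are continuous 3D hand-motion (and head-orientation) measurements, not quantized symbols.

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm.
    pi: initial state probabilities, A: transition matrix,
    B: per-state emission probabilities, obs: list of symbol indices."""
    n = len(pi)
    # Initialization: alpha_1(i) = pi_i * B_i(o_1), kept in log-space.
    alpha = [math.log(pi[i]) + math.log(B[i][obs[0]]) for i in range(n)]
    # Induction: sum over predecessor states at each time step.
    for t in range(1, len(obs)):
        alpha = [
            math.log(sum(math.exp(alpha[j]) * A[j][i] for j in range(n)))
            + math.log(B[i][obs[t]])
            for i in range(n)
        ]
    # Termination: total probability is the sum over final states.
    return math.log(sum(math.exp(a) for a in alpha))

# Toy 2-state models; observation symbols are hypothetical quantized
# hand-motion features: 0 = hand low/still, 1 = hand rising, 2 = extended.
pointing = dict(pi=[0.9, 0.1],
                A=[[0.6, 0.4], [0.1, 0.9]],
                B=[[0.2, 0.7, 0.1], [0.05, 0.15, 0.8]])
idle = dict(pi=[0.5, 0.5],
            A=[[0.9, 0.1], [0.1, 0.9]],
            B=[[0.8, 0.1, 0.1], [0.7, 0.2, 0.1]])

# A rising-then-extended hand trajectory should fit the pointing model.
seq = [0, 1, 1, 2, 2, 2]
scores = {name: forward_log_likelihood(**m, obs=seq)
          for name, m in (("pointing", pointing), ("idle", idle))}
best = max(scores, key=scores.get)  # "pointing" for this sequence
```

In practice one model per gesture (or gesture phase) is trained, and a sequence is assigned to the model with the highest likelihood; the paper's contribution of adding head orientation would correspond to extending the observation vector with an extra feature.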