Position-invariant, real-time gesture recognition based on dynamic time warping

  • Authors:
  • Saša Bodiroža; Guillaume Doisy; Verena Vanessa Hafner

  • Affiliations:
  • Humboldt-Universität zu Berlin, Berlin, Germany; Ben-Gurion University of the Negev, Beer Sheva, Israel; Humboldt-Universität zu Berlin, Berlin, Germany

  • Venue:
  • Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction
  • Year:
  • 2013

Abstract

To improve human-robot interaction, it is necessary to allow the human participant to interact with the robot in a natural way. In this work, a gesture recognition algorithm based on dynamic time warping was implemented for a use-case scenario of natural interaction with a mobile robot. The inputs are gesture trajectories obtained with a Microsoft Kinect sensor and stored in the person's frame of reference. Furthermore, the recognition is position-invariant, meaning that only one learned sample is needed to recognize the same gesture performed at another position in the gestural space. In the experiments, a set of gestures for a robot waiter was used to train the gesture recognition algorithm. The experimental results show that the proposed modifications to the standard gesture recognition algorithm improve the robustness of the recognition.
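
The paper itself does not include code; the following is a minimal sketch, not the authors' implementation, of how position-invariant gesture matching with dynamic time warping might look. It assumes per-frame hand and torso joint positions from a Kinect skeleton, one stored template per gesture, trajectories expressed relative to the torso and shifted so the first sample sits at the origin, and nearest-template classification by DTW distance. All function names and the normalization choice are illustrative assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two point sequences
    a (n, 3) and b (m, 3), using Euclidean point-to-point cost."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local cost
            cost[i, j] = d + min(cost[i - 1, j],       # skip a point in b
                                 cost[i, j - 1],       # skip a point in a
                                 cost[i - 1, j - 1])   # match both points
    return cost[n, m]

def normalize(hand, torso):
    """Express a hand trajectory in the person's frame of reference
    (relative to the torso joint per frame) and shift it so the first
    sample is at the origin, making the comparison position-invariant."""
    rel = hand - torso
    return rel - rel[0]

def recognize(query_hand, query_torso, templates):
    """Nearest-template classification by DTW distance.
    templates: dict mapping gesture label -> normalized template trajectory."""
    q = normalize(query_hand, query_torso)
    distances = {label: dtw_distance(q, t) for label, t in templates.items()}
    return min(distances, key=distances.get), distances
```

Because the normalization removes the gesture's absolute starting position, a single learned sample can in principle match the same gesture performed elsewhere in the gestural space, which is the position-invariance property described in the abstract; for real-time use, a thresholded sliding window over the incoming trajectory would typically be added on top of this matcher.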