Static and dynamic hand-gesture recognition for augmented reality applications

  • Authors:
  • Stefan Reifinger, Frank Wallhoff, Markus Ablassmeier, Tony Poitschke, Gerhard Rigoll

  • Affiliations:
  • Technische Universität München, Institute for Man Machine Communication, Munich, Germany (all authors)

  • Venue:
  • HCI'07 Proceedings of the 12th international conference on Human-computer interaction: intelligent multimodal interaction environments
  • Year:
  • 2007

Abstract

This contribution presents our approach to an instrumented automatic gesture recognition system for Augmented Reality that can distinguish static and dynamic gestures. Based on an infrared tracking system, infrared targets mounted on the user's thumbs and index fingers provide the position and orientation of each finger. Our system receives this information and recognizes static gestures with distance classifiers and dynamic gestures with statistical models. The recognized gesture is provided to any connected application. We introduce a small demonstration as the basis for a short evaluation, in which we compare interaction in a real environment, Augmented Reality with mouse/keyboard, and our gesture recognition system with respect to properties such as task execution time and intuitiveness of interaction. The results show that tasks are executed faster with our gesture recognition system than with the mouse/keyboard; however, this improvement comes at the cost of slightly lowered wearing comfort.
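The abstract mentions distance classifiers for static gestures. As a minimal sketch of how such a classifier could work (the feature choice, template values, and rejection threshold below are illustrative assumptions, not taken from the paper): each finger pose is reduced to a feature vector, and the stored gesture template nearest to it wins, with poses far from every template rejected.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical gesture templates: here each feature vector holds the
# thumb-to-index-finger distance (in metres) for the left and right hand.
TEMPLATES = {
    "pinch": [0.02, 0.02],
    "open":  [0.12, 0.12],
    "point": [0.10, 0.02],
}

def classify_static(features, reject_threshold=0.05):
    """Return the label of the nearest template, or None if no
    template lies within the (assumed) rejection threshold."""
    best_label, best_dist = None, float("inf")
    for label, template in TEMPLATES.items():
        d = euclidean(features, template)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= reject_threshold else None

print(classify_static([0.11, 0.03]))  # → point
print(classify_static([0.50, 0.50]))  # → None (no template is close)
```

A dynamic-gesture recognizer would instead score a whole trajectory of such feature vectors against statistical models, as the abstract describes.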