Continuous recognition of one-handed and two-handed gestures using 3D full-body motion tracking sensors

  • Authors:
  • Per Ola Kristensson; Thomas Nicholson; Aaron Quigley

  • Affiliations:
  • University of St Andrews, St Andrews, Fife, United Kingdom (all authors)

  • Venue:
  • Proceedings of the 2012 ACM international conference on Intelligent User Interfaces
  • Year:
  • 2012

Abstract

In this paper we present a new bimanual markerless gesture interface for 3D full-body motion tracking sensors, such as the Kinect. Our interface uses a probabilistic algorithm to incrementally predict users' intended one-handed and two-handed gestures while they are still being articulated. It supports scale and translation invariant recognition of arbitrarily defined gesture templates in real-time. The interface supports two ways of gesturing commands in thin air to displays at a distance. First, users can use one-handed and two-handed gestures to directly issue commands. Second, users can use their non-dominant hand to modulate single-hand gestures. Our evaluation shows that the system recognizes one-handed and two-handed gestures with an accuracy of 92.7%--96.2%.
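The abstract does not give the authors' algorithm, but the general idea of incremental, scale- and translation-invariant template matching can be illustrated with a rough sketch. This is not the paper's actual method; all function names, the normalization scheme, and the Gaussian scoring parameter `sigma` below are invented for illustration. As each new tracked hand position arrives, the partial trajectory is normalized, resampled, and compared against prefixes of every stored template, yielding a probability distribution over intended gestures before articulation finishes:

```python
import math

def normalize(points):
    """Translate to the centroid and scale by the bounding-box size,
    giving translation and scale invariance (one possible scheme)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    size = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / size, (y - cy) / size) for x, y in points]

def resample(points, n):
    """Resample a trajectory to n evenly spaced points by linear interpolation."""
    if len(points) == 1:
        return points * n
    out = []
    for i in range(n):
        t = i * (len(points) - 1) / (n - 1)
        j = min(int(t), len(points) - 2)
        f = t - j
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
    return out

def incremental_probabilities(partial, templates, n=32, sigma=0.2):
    """Score the normalized partial gesture against every prefix of each
    template (articulation progress is unknown), then convert the best
    distances into a normalized probability distribution."""
    p = resample(normalize(partial), n)
    scores = {}
    for name, tmpl in templates.items():
        best = float("inf")
        for cut in range(2, len(tmpl) + 1):
            q = resample(normalize(tmpl[:cut]), n)
            d = sum(math.dist(a, b) for a, b in zip(p, q)) / n
            best = min(best, d)
        # Gaussian likelihood of the best prefix distance
        scores[name] = math.exp(-best * best / (2 * sigma * sigma))
    z = sum(scores.values()) or 1.0
    return {k: v / z for k, v in scores.items()}
```

For example, a partially articulated horizontal stroke should already favor a rightward-swipe template over a vertical one, which is what lets such an interface act on a gesture before it is complete:

```python
templates = {
    "swipe_right": [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)],
    "swipe_up":    [(0, 0), (0, 1), (0, 2), (0, 3), (0, 4)],
}
probs = incremental_probabilities([(5, 5), (6, 5), (7, 5)], templates)
```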