3D hand tracking for human computer interaction

  • Authors:
  • Victor Adrian Prisacariu; Ian Reid

  • Venue:
  • Image and Vision Computing
  • Year:
  • 2012

Abstract

We propose a real-time model-based 3D hand tracker that combines image regions with the signal from an off-the-shelf 3-axis accelerometer placed on the user's hand. The visual regions allow the tracker to cope with occlusions, motion blur and background clutter, while the accelerometer signal helps resolve the inherent silhouette-pose ambiguities. The accelerometer and tracker are synchronised by casting the calibration problem as one of principal component analysis. Based on the assumption that the number of possible hand configurations is often limited by the activity the hand is engaged in, we use a multiclass pose classifier to distinguish between a number of activity-dependent articulated hand configurations. We demonstrate the benefits of our method, both qualitatively and quantitatively, on a variety of video sequences and hand configurations, and show a proof-of-concept human computer interface based on our system.
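
As a concrete illustration of the calibration idea mentioned in the abstract, the following is a minimal sketch of one way aligning an accelerometer frame with a camera frame could be cast as principal component analysis. It assumes paired (N, 3) signals are available: raw accelerometer samples and the corresponding accelerations implied by the visual tracker's pose estimates over the same time window. The function names (`pca_axes`, `align_frames`), the synthetic test data, and the sign-correction step are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np


def pca_axes(samples):
    """Principal axes (covariance eigenvectors) of an (N, 3) signal, ordered by variance."""
    centred = samples - samples.mean(axis=0)
    cov = centred.T @ centred / len(centred)
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    return eigvecs[:, np.argsort(eigvals)[::-1]]  # columns sorted by descending variance


def align_frames(accel_samples, vision_samples):
    """Estimate the rotation mapping accelerometer-frame data onto the camera frame.

    Both arguments are (N, 3) arrays of corresponding measurements
    (hypothetical: sensor accelerations and the accelerations implied by
    the visual tracker's pose estimates over the same time window).
    """
    U_a = pca_axes(accel_samples)
    U_v = pca_axes(vision_samples)
    R = U_v @ U_a.T
    if np.linalg.det(R) < 0:          # PCA axes are sign-ambiguous; enforce a proper rotation
        U_a[:, -1] *= -1
        R = U_v @ U_a.T
    return R


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Synthetic ground-truth rotation between the two frames.
    Q = np.linalg.qr(rng.normal(size=(3, 3)))[0]
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1
    true_R = Q

    # Anisotropic motion makes the principal axes well defined.
    accel = rng.normal(size=(500, 3)) * np.array([3.0, 1.5, 0.5])
    vision = accel @ true_R.T + 0.01 * rng.normal(size=(500, 3))

    R_est = align_frames(accel, vision)
    # Close to the identity, up to the per-axis sign flips PCA cannot resolve.
    print(np.round(R_est @ true_R.T, 2))
```

Because PCA leaves the sign of each principal axis undetermined, a practical calibration would need an extra disambiguation step (for example, correlating corresponding components of the two signals) before the recovered rotation is used to fuse the accelerometer with the visual tracker.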