Finger tracking methods using EyesWeb

  • Authors:
  • Anne-Marie Burns; Barbara Mazzarino

  • Affiliations:
  • Input Devices and Music Interaction Lab, Schulich School of Music, McGill University, Montréal, Québec, Canada; InfoMus Lab, DIST – University of Genova, Genova, Italy

  • Venue:
  • GW '05: Proceedings of the 6th International Conference on Gesture in Human-Computer Interaction and Simulation
  • Year:
  • 2005

Abstract

This paper compares different algorithms for tracking the position of fingers in a two-dimensional environment. Four algorithms have been implemented in EyesWeb, a platform developed by the DIST-InfoMus laboratory. The first three algorithms use projection signatures, the circular Hough transform, and geometric properties, and rely only on hand characteristics to locate the fingers. The fourth algorithm uses color markers and serves as a reference system for the other three. All the algorithms have been evaluated using two-dimensional video images of a hand performing different finger movements on a flat surface. Results on the accuracy, precision, latency, and computer-resource usage of the different algorithms are provided. Applications of this research include human-computer interaction systems based on hand gesture, sign language recognition, hand posture recognition, and gestural control of music.
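As a rough illustration of one of the marker-free approaches named above, a circular Hough transform can locate a rounded fingertip by letting each edge pixel vote for every candidate circle centre a fixed radius away; the accumulator peak marks the fingertip centre. The sketch below is not the paper's EyesWeb implementation — the fixed radius, grid size, and synthetic edge points are all assumptions chosen for illustration.

```python
import numpy as np

def circular_hough(edge_points, shape, radius):
    """Accumulate centre votes for circles of a fixed radius.

    Each edge point votes for all grid cells lying `radius` away;
    the accumulator maximum is the most likely circle centre.
    """
    acc = np.zeros(shape, dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, 90, endpoint=False)
    for (y, x) in edge_points:
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)  # unbuffered vote counting
    return acc

# Synthetic "fingertip" contour: points on a circle of radius 10
# centred at (40, 60) in a 100x100 grid (illustrative values only).
angles = np.linspace(0.0, 2.0 * np.pi, 120, endpoint=False)
edges = [(40 + 10 * np.sin(a), 60 + 10 * np.cos(a)) for a in angles]

acc = circular_hough(edges, (100, 100), radius=10)
cy, cx = np.unravel_index(np.argmax(acc), acc.shape)
print(cy, cx)  # accumulator peak falls at (or next to) the true centre (40, 60)
```

In practice the contour would come from an edge detector applied to the segmented hand image, and the transform would be run over a small range of radii to cope with varying fingertip size and camera distance.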