Upper body tracking for interactive applications

  • Authors:
  • José María Buades Rubio; Francisco J. Perales; Manuel González Hidalgo; Javier Varona

  • Affiliations:
  • Ed. Anselm Turmeda, Universitat de les Illes Balears, Palma de Mallorca, Spain (all authors)

  • Venue:
  • AMDO'06 Proceedings of the 4th international conference on Articulated Motion and Deformable Objects
  • Year:
  • 2006

Abstract

In this paper we describe a complete method for building a perceptual user interface in indoor, uncontrolled environments. The overall system uses two calibrated cameras and performs an initialization step: it detects the user, takes his/her measurements, and builds a 3D model. It then performs matching and tracking for the trunk, head, left arm, right arm, and hands. The system waits for a user in a predefined posture; once the user has been detected, he/she is analysed, measurements are taken, and a 3D model is built. Tracking is carried out by a particle filter algorithm split into three steps: tracking of the head-trunk, tracking of the left arm, and tracking of the right arm. This proposed divide-and-conquer solution improves computation time while obtaining results similar to or better than the sequential solution. The matching process uses two sub-matching functions, one to compute a color score and another to compute a shape score. Finally, the system provides numerical values for joints and end effectors to be used in interactive applications.
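The divide-and-conquer tracking scheme the abstract describes can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `matching_score` function is a toy stand-in for the paper's color and shape sub-matching functions, the 2-D pose vectors, particle counts, and noise parameters are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def matching_score(pose, observation):
    # Combined likelihood from two hypothetical sub-matching terms,
    # mimicking the paper's color + shape decomposition (toy stand-ins).
    color = np.exp(-np.sum((pose - observation) ** 2))   # color-similarity stand-in
    shape = np.exp(-np.sum(np.abs(pose - observation)))  # shape-similarity stand-in
    return color * shape

def particle_filter_step(particles, observation, noise=0.05):
    # One predict / weight / resample cycle for a single body part.
    particles = particles + rng.normal(0.0, noise, particles.shape)   # predict (random walk)
    weights = np.array([matching_score(p, observation) for p in particles])
    weights /= weights.sum()                                          # normalize weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)  # resample
    return particles[idx]

# Divide and conquer: three small, independent filters (head-trunk, left
# arm, right arm) instead of one joint filter over the full pose vector.
parts = {name: rng.normal(0.0, 1.0, (200, 2))
         for name in ("head_trunk", "left_arm", "right_arm")}
observations = {"head_trunk": np.array([0.1, 0.2]),
                "left_arm":   np.array([-0.5, 0.3]),
                "right_arm":  np.array([0.4, -0.1])}

for _ in range(30):
    for name in parts:
        parts[name] = particle_filter_step(parts[name], observations[name])

# Mean of each particle set is the estimated configuration of that part.
estimates = {name: parts[name].mean(axis=0) for name in parts}
```

Splitting the state space this way keeps the number of particles per filter small, which is the source of the computation-time improvement the abstract claims over a single sequential filter on the full-body state.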