Articulated Model Based People Tracking Using Motion Models

  • Authors:
  • Huazhong Ning

  • Affiliations:
  • -

  • Venue:
  • ICMI '02 Proceedings of the 4th IEEE International Conference on Multimodal Interfaces
  • Year:
  • 2002

Abstract

This paper focuses on the acquisition of human motion data such as joint angles and velocities for virtual reality applications, using both an articulated body model and a motion model in the CONDENSATION framework. First, we learn a motion model represented by Gaussian distributions, and we explore motion constraints by considering the dependencies among motion parameters, representing them as conditional distributions. Both are then integrated into the dynamic model to concentrate factored sampling in the areas of state space with the most posterior information. To measure the observation density with accuracy and robustness, a PEF (Pose Evaluation Function) modeled with a radial term is proposed. We also address the issues of automatic acquisition of the initial model posture and recovery from severe failures. A large number of experiments on several persons demonstrate that our approach works well.
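To make the sampling scheme concrete, below is a minimal sketch of one CONDENSATION-style iteration with a Gaussian motion model and a generic observation density, as the abstract describes. The function names, parameters, and the use of NumPy are illustrative assumptions, not the authors' implementation; the actual PEF and learned motion constraints from the paper are abstracted behind the `observation_density` and `motion_mean`/`motion_cov` placeholders.

```python
import numpy as np

def condensation_step(particles, weights, motion_mean, motion_cov,
                      observation_density, rng=None):
    """One CONDENSATION iteration: factored sampling with a Gaussian motion model.

    particles:           (N, D) array of state vectors (e.g. joint angles)
    weights:             (N,) normalized particle weights
    motion_mean, motion_cov: parameters of an assumed Gaussian motion model
    observation_density: callable mapping a state to a likelihood score,
                         standing in for a pose evaluation function (PEF)
    """
    rng = rng or np.random.default_rng()
    n, _ = particles.shape

    # 1. Resample particles in proportion to their weights (factored sampling).
    idx = rng.choice(n, size=n, p=weights)
    resampled = particles[idx]

    # 2. Predict: diffuse each particle with the learned Gaussian motion model,
    #    concentrating samples where the motion prior places probability mass.
    predicted = resampled + rng.multivariate_normal(motion_mean, motion_cov, size=n)

    # 3. Measure: weight each predicted pose by the observation density.
    new_weights = np.array([observation_density(s) for s in predicted])
    new_weights /= new_weights.sum()

    return predicted, new_weights
```

In this sketch the motion constraints from the paper (conditional distributions among motion parameters) would enter through the prediction step, replacing the single multivariate Gaussian with the learned, dependency-aware dynamic model.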