Tracking articulated objects by learning intrinsic structure of motion

  • Authors:
  • Xinxiao Wu, Wei Liang, Yunde Jia

  • Affiliations:
  • Beijing Laboratory of Intelligent Information Technology, School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, PR China (all authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2009

Abstract

In this paper, we propose a novel dimensionality reduction method, temporal neighbor preserving embedding (TNPE), to learn the low-dimensional intrinsic motion manifold of articulated objects. The method simultaneously learns the embedding manifold and the mapping from the image feature space to the embedding space by preserving the local temporal relationships hidden in sequential data points. Tracking is then formulated as the problem of estimating the configuration of an articulated object from the learned central embedding representation. To solve this problem, we combine a Bayesian mixture of experts (BME) with a Gaussian mixture model (GMM) to establish a probabilistic non-linear mapping from the embedding space to the configuration space. Experimental results on articulated hand and human pose tracking show encouraging performance in terms of stability and accuracy.
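
The abstract describes a two-stage pipeline: a linear embedding learned by preserving temporal neighborhood relationships, followed by a probabilistic mapping from the embedding space to the pose configuration space. The sketch below illustrates how such a pipeline could be set up in Python; it is not the authors' implementation. The function names (temporal_npe, gmm_pose_mapping), the parameters k and reg, and the use of scikit-learn's GaussianMixture as a stand-in for the Bayesian mixture of experts are all assumptions introduced for illustration.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.mixture import GaussianMixture


def temporal_npe(X, n_components=2, k=1, reg=1e-3):
    """Illustrative stand-in for TNPE: a linear embedding that preserves
    each frame's reconstruction from its temporal neighbours.

    X : (n_frames, n_features) sequential image features.
    Returns the projection matrix A and the embedded coordinates X @ A.
    """
    n, d = X.shape
    W = np.zeros((n, n))
    for i in range(n):
        # Temporal neighbours: the k previous and k following frames.
        nbrs = [j for j in range(i - k, i + k + 1) if j != i and 0 <= j < n]
        Z = X[nbrs] - X[i]
        G = Z @ Z.T + reg * np.eye(len(nbrs))       # local Gram matrix
        w = np.linalg.solve(G, np.ones(len(nbrs)))  # reconstruction weights
        W[i, nbrs] = w / w.sum()
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    # Generalized eigenproblem (as in neighborhood preserving embedding):
    # keep the directions with the smallest eigenvalues.
    vals, vecs = eigh(X.T @ M @ X, X.T @ X + reg * np.eye(d))
    A = vecs[:, :n_components]
    return A, X @ A


def gmm_pose_mapping(Y, C, n_mixtures=5, seed=0):
    """Probabilistic mapping from embedding space Y to configuration space C,
    sketched with a Gaussian mixture fitted on the joint space."""
    dy, dc = Y.shape[1], C.shape[1]
    gmm = GaussianMixture(n_components=n_mixtures, covariance_type="full",
                          random_state=seed).fit(np.hstack([Y, C]))

    def predict(y):
        y = np.atleast_2d(y)
        preds = np.zeros((len(y), dc))
        for i, yi in enumerate(y):
            resp = np.zeros(n_mixtures)
            cond = np.zeros((n_mixtures, dc))
            for m in range(n_mixtures):
                mu_y, mu_c = gmm.means_[m, :dy], gmm.means_[m, dy:]
                S_yy = gmm.covariances_[m][:dy, :dy]
                S_cy = gmm.covariances_[m][dy:, :dy]
                diff = yi - mu_y
                # Responsibility of component m for this embedding point.
                resp[m] = gmm.weights_[m] * np.exp(
                    -0.5 * diff @ np.linalg.solve(S_yy, diff)
                ) / np.sqrt(np.linalg.det(2 * np.pi * S_yy))
                # Conditional mean of the configuration given the embedding.
                cond[m] = mu_c + S_cy @ np.linalg.solve(S_yy, diff)
            preds[i] = (resp / resp.sum()) @ cond
        return preds

    return predict
```

In the paper, the mapping from the embedding space to the configuration space combines a Bayesian mixture of experts with a GMM; the conditional-mean prediction above is only a simple GMM-regression approximation of that idea, included to make the overall structure of the pipeline concrete.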