Real-time human pose tracking from range data

  • Authors:
  • Varun Ganapathi; Christian Plagemann; Daphne Koller; Sebastian Thrun

  • Affiliations:
  • Computer Science Department, Stanford University, Stanford, CA, USA and Google Inc., Mountain View, CA; Computer Science Department, Stanford University, Stanford, CA, USA and Google Inc., Mountain View, CA; Computer Science Department, Stanford University, Stanford, CA, USA; Computer Science Department, Stanford University, Stanford, CA, USA and Google Inc., Mountain View, CA

  • Venue:
  • ECCV'12: Proceedings of the 12th European Conference on Computer Vision - Volume Part VI
  • Year:
  • 2012

Abstract

Tracking human pose in real time is a difficult problem with many interesting applications. Existing solutions suffer from a variety of problems, especially when confronted with unusual human poses. In this paper, we derive an algorithm for tracking human pose in real time from depth sequences, based on MAP inference in a probabilistic temporal model. The key idea is to extend the iterative closest point (ICP) objective by modeling the constraint that the observed subject cannot enter free space, the region of space between the sensor and the true range measurements. Our primary contribution is an extension to the articulated ICP algorithm that can efficiently enforce this constraint. The resulting filter runs at 125 frames per second on a single desktop CPU core. We provide extensive experimental results on challenging real-world data, which show that the algorithm outperforms previous state-of-the-art trackers in both computational efficiency and accuracy.
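
To make the free-space idea from the abstract concrete, the following is a minimal, illustrative Python/NumPy sketch of an ICP-style energy with an added free-space penalty. It is not the paper's implementation: the camera intrinsics (fx, fy, cx, cy), the margin, the weight lambda_fs, and the brute-force nearest-neighbor search are all assumptions introduced purely for illustration.

    # Illustrative sketch only: an ICP-style data term plus a penalty for
    # model points that fall into free space (the region between the depth
    # sensor and the measured range surface). Not the authors' code.
    import numpy as np

    def project(points, fx, fy, cx, cy):
        """Project 3-D points (N, 3) in camera coordinates to pixel coordinates and depth."""
        z = points[:, 2]
        u = fx * points[:, 0] / z + cx
        v = fy * points[:, 1] / z + cy
        return u, v, z

    def icp_data_term(observed_pts, model_pts):
        """Classic ICP term: squared distance from each observed point to its
        nearest model point (brute force for clarity, not speed)."""
        d2 = ((observed_pts[:, None, :] - model_pts[None, :, :]) ** 2).sum(axis=-1)
        return d2.min(axis=1).sum()

    def free_space_penalty(model_pts, depth_map, fx, fy, cx, cy, margin=0.01):
        """Penalize model points lying in front of the measured surface,
        i.e. inside the free space carved out by the depth sensor."""
        u, v, z = project(model_pts, fx, fy, cx, cy)
        h, w = depth_map.shape
        ui = np.clip(np.round(u).astype(int), 0, w - 1)
        vi = np.clip(np.round(v).astype(int), 0, h - 1)
        measured = depth_map[vi, ui]
        valid = measured > 0                                 # skip pixels with no return
        violation = np.maximum(measured - margin - z, 0.0)   # > 0 iff point is in free space
        return (violation[valid] ** 2).sum()

    def energy(observed_pts, model_pts, depth_map, fx, fy, cx, cy, lambda_fs=1.0):
        """Combined objective: ICP data term plus weighted free-space term."""
        return (icp_data_term(observed_pts, model_pts)
                + lambda_fs * free_space_penalty(model_pts, depth_map, fx, fy, cx, cy))

Minimizing such an energy over the articulated pose parameters (which generate model_pts) is the general shape of the approach the abstract describes; the paper's contribution lies in enforcing the free-space constraint efficiently enough to reach real-time rates.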