Model-Based Tracking by Classification in a Tiny Discrete Pose Space

  • Authors: Limin Shang, Piotr Jasiobedzki, Michael Greenspan


  • Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year: 2007


Abstract

A method is presented for tracking 3D objects as they transform rigidly in space within a sparse range image sequence. The method operates in discrete space and exploits the coherence across image frames that results from the relationship between known bounds on the object's velocity and the sensor frame rate. These motion bounds allow the interframe transformation space to be reduced to a reasonable and indeed tiny size, comprising only tens or hundreds of possible states. The tracking problem is in this way cast into a classification framework, effectively trading off localization precision for runtime efficiency and robustness. The method has been implemented and tested extensively on a variety of freeform objects within a sparse range data stream comprising only a few hundred points per image. It has been shown to compare favorably against continuous domain Iterative Closest Point (ICP) tracking methods, performing both more efficiently and more robustly. A hybrid method has also been implemented that executes a small number of ICP iterations following the initial discrete classification phase. This hybrid method is both more efficient than the ICP alone and more robust than either the discrete classification method or the ICP separately.
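The core idea of the classification phase can be sketched in a few lines: enumerate the tiny discrete set of interframe transformations permitted by the motion bounds, score each candidate against the sparse range points, and pick the best. The sketch below is a minimal toy illustration under stated assumptions, not the authors' implementation: it restricts the pose space to a Z-axis rotation grid plus a few translations, and scores a candidate by the mean nearest-neighbor distance from scene points to the transformed model.

```python
import numpy as np

def rot_z(theta):
    # rotation about the Z axis (toy 1-DOF rotation grid for illustration)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def score(model, scene, R, t):
    # mean nearest-neighbor distance from each scene point to the
    # transformed model (brute force; fine for sparse range data)
    transformed = model @ R.T + t
    d = np.linalg.norm(scene[:, None, :] - transformed[None, :, :], axis=2)
    return d.min(axis=1).mean()

def classify_pose(model, scene, angles, translations):
    # exhaustively score the tiny discrete interframe pose space
    # (tens to hundreds of states) and return the best candidate
    best = None
    for th in angles:
        R = rot_z(th)
        for t in translations:
            s = score(model, scene, R, np.asarray(t))
            if best is None or s < best[0]:
                best = (s, th, tuple(t))
    return best

# toy usage: recover a known small rotation and translation
rng = np.random.default_rng(0)
model = rng.standard_normal((30, 3))
scene = model @ rot_z(0.1).T + np.array([0.05, 0.0, 0.0])
angles = [-0.1, 0.0, 0.1]
translations = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (-0.05, 0.0, 0.0)]
best_score, best_angle, best_t = classify_pose(model, scene, angles, translations)
```

In the hybrid scheme described above, the winning discrete pose would then seed a small number of ICP iterations for continuous refinement; here classification alone suffices because the true pose lies exactly on the grid.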