Reducing drift in differential tracking

  • Authors:
  • Ali Rahimi, Louis-Philippe Morency, Trevor Darrell

  • Affiliation:
  • Massachusetts Institute of Technology, Computer Science and AI Lab, Cambridge, MA 02139, USA

  • Venue:
  • Computer Vision and Image Understanding
  • Year:
  • 2008

Abstract

We present methods for turning pair-wise registration algorithms into drift-free trackers. Such registration algorithms are abundant, but the simplest techniques for building trackers on top of them exhibit either limited tracking range or drift. Our algorithms maintain the poses associated with a number of key frames, building a view-based appearance model that is used for tracking and refined during tracking. The first method we propose is batch oriented and is ideal for offline tracking. The second is suited for recovering egomotion in large environments where the trajectory of the camera rarely intersects itself, and in other situations where many views are necessary to capture the appearance of the scene. The third method is suitable for situations where a few views are sufficient to capture the appearance of the scene, such as object-tracking. We demonstrate the techniques on egomotion and head-tracking examples and show that they can track for an indefinite amount of time without accumulating drift.
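The key-frame idea in the abstract can be illustrated with a toy sketch (not the paper's actual algorithm): pairwise registration with a small systematic bias drifts linearly when estimates are chained frame-to-frame, but stays bounded when each frame is registered against the nearest of a few stored key frames. The `register` function, the 1-D poses, and the distance threshold are all illustrative assumptions.

```python
EPS = 0.01  # hypothetical systematic registration bias, for illustration

def register(pose_a, pose_b):
    """Toy pairwise registration: true relative pose plus a fixed bias."""
    return (pose_b - pose_a) + EPS

# Camera oscillates over a small range, so a few views cover the scene
# (the object-tracking setting described in the abstract).
true_poses = [abs((t % 20) - 10) / 10.0 for t in range(200)]

# 1) Frame-to-frame chaining: the bias accumulates at every step.
est = true_poses[0]
prev = true_poses[0]
for p in true_poses[1:]:
    est += register(prev, p)
    prev = p
drift_chain = abs(est - true_poses[-1])

# 2) Key-frame tracking: register against the nearest stored key frame,
#    adding a new key frame only when none is close enough.
keyframes = [(true_poses[0], true_poses[0])]  # (true pose, estimated pose)
for p in true_poses[1:]:
    kf_true, kf_est = min(keyframes, key=lambda kf: abs(kf[0] - p))
    est = kf_est + register(kf_true, p)
    if abs(kf_true - p) > 0.3:  # far from every key frame: store a new one
        keyframes.append((p, est))
drift_kf = abs(est - true_poses[-1])

print(f"frame-to-frame drift: {drift_chain:.3f}")
print(f"key-frame drift:      {drift_kf:.3f}")
```

Because the trajectory revisits earlier views, the key-frame tracker keeps registering against the same small appearance model and its error stays on the order of a single registration, while the chained estimate degrades with every frame.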