A setup for evaluating detectors and descriptors for visual tracking

  • Authors:
  • Steffen Gauglitz; Tobias Höllerer; Petra Krahwinkler; Jürgen Roßmann

  • Affiliations:
  • Dept. of Computer Science, University of California, Santa Barbara, USA (Gauglitz, Höllerer); Institute of Man-Machine Interaction, RWTH Aachen University, Germany (Krahwinkler, Roßmann)

  • Venue:
  • ISMAR '09 Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality
  • Year:
  • 2009

Abstract

In many cases, visual tracking is based on detecting, describing, and then matching local features. A variety of algorithms for these steps have been proposed and used in tracking systems, leading to an increased need for independent comparisons. However, existing evaluations are geared towards object recognition and image retrieval, and their results have limited validity for real-time visual tracking. We present a setup for evaluating detectors and descriptors that is geared towards visual tracking in its testbed, candidate algorithms, and performance criteria. Most notably, our testbed consists of video streams with several thousand frames naturally affected by noise and motion blur.
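The detect–describe–match pipeline the abstract refers to can be sketched in a few lines. The snippet below is an illustrative, self-contained example (not one of the paper's candidate algorithms): it matches descriptor vectors between two frames by brute-force nearest-neighbor search with a ratio test to reject ambiguous correspondences; the toy descriptor arrays are invented for demonstration.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Brute-force nearest-neighbor matching with a ratio test.

    desc_a, desc_b: (N, D) arrays of local feature descriptors.
    Returns a list of (i, j) index pairs deemed reliable matches.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)  # distance to every candidate
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only if the best match is clearly better than the runner-up.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Toy example: three 4-D descriptors; the second frame perturbs and reorders them.
frame1 = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0], [0, 0, 1.0, 0]])
frame2 = np.array([[0, 0.98, 0.05, 0], [1.02, 0, 0, 0.01], [0, 0, 0, 5.0]])
print(match_descriptors(frame1, frame2))  # → [(0, 1), (1, 0)]
```

Note that the third descriptor in `frame1` finds no confident match: its two nearest candidates are nearly equidistant, so the ratio test discards it — exactly the kind of ambiguity that matters when frames are degraded by noise and motion blur.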