Robust feature descriptors for efficient vision-based tracking

  • Authors:
  • Gerardo Carrera, Jesus Savage, Walterio Mayol-Cuevas

  • Affiliations:
  • Universidad Nacional Autonoma de Mexico, Department of Electrical Engineering, Bio-Robotics Laboratory, Mexico City, Mexico (Carrera, Savage); University of Bristol, Computer Science, Bristol, UK (Mayol-Cuevas)

  • Venue:
  • CIARP '07: Proceedings of the 12th Iberoamerican Congress on Pattern Recognition (Progress in Pattern Recognition, Image Analysis and Applications)
  • Year:
  • 2007

Abstract

This paper presents a robust implementation of an object tracker that tolerates partial occlusions, rotation, and changes in scale for a variety of objects. Each object is represented by a collection of interest points described in a multi-resolution framework, giving a representation of those points at different scales. Inspired by [1], a stack of descriptors is built only the first time the interest points are detected and extracted from the region of interest. This yields an efficient representation and faster tracking, since the descriptor construction can be done off-line. An Unscented Kalman Filter (UKF) with a constant velocity model estimates the position and scale of the object; using the uncertainty in position and scale provided by the UKF, the search for the object can be constrained to a specific region in both the image and in scale. This approach improves real-time tracking and the ability to recover from full occlusions.
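
To make the filtering step concrete, the sketch below shows how a constant-velocity UKF over image position and scale could be used to gate the descriptor search. It is a minimal sketch using the filterpy library; the 6-D state layout, noise values, and the 3-sigma search window are illustrative assumptions, not details taken from the paper.

import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

def fx(x, dt):
    # Constant-velocity transition over state (u, v, s, du, dv, ds).
    F = np.array([[1, 0, 0, dt, 0, 0],
                  [0, 1, 0, 0, dt, 0],
                  [0, 0, 1, 0, 0, dt],
                  [0, 0, 0, 1, 0, 0],
                  [0, 0, 0, 0, 1, 0],
                  [0, 0, 0, 0, 0, 1]], dtype=float)
    return F @ x

def hx(x):
    # Measurement model: the observed image position and scale.
    return x[:3]

points = MerweScaledSigmaPoints(n=6, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=3, dt=1.0, fx=fx, hx=hx, points=points)
ukf.x = np.array([320.0, 240.0, 1.0, 0.0, 0.0, 0.0])  # initial position (px) and scale
ukf.P *= 10.0                       # initial uncertainty
ukf.R = np.diag([4.0, 4.0, 0.01])   # measurement noise (illustrative values)
ukf.Q = np.eye(6) * 0.1             # process noise (illustrative values)

# Per frame: predict, derive a gated search region from the predicted covariance,
# match the precomputed descriptor stack only inside that region, then update.
ukf.predict()
u, v, s = ukf.x[:3]
sig_u, sig_v, sig_s = np.sqrt(np.diag(ukf.P)[:3])
search_window = (u - 3 * sig_u, u + 3 * sig_u, v - 3 * sig_v, v + 3 * sig_v)
scale_range = (max(s - 3 * sig_s, 0.0), s + 3 * sig_s)

# Hypothetical measurement: matched object centre and scale in the current frame.
z = np.array([322.0, 241.5, 1.02])
ukf.update(z)

Restricting descriptor matching to search_window and scale_range is what provides the efficiency gain described in the abstract, since most of the image and most scales are never examined.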