Tracking objects with generic calibrated sensors: An algorithm based on color and 3D shape features

  • Authors:
  • M. Taiana; J. Santos; J. Gaspar; J. Nascimento; A. Bernardino; P. Lima

  • Affiliations:
  • Institute for Systems and Robotics, Instituto Superior Técnico, 1049-001 Lisboa, Portugal (all authors)

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2010

Abstract

We present a color- and shape-based 3D tracking system suited to a large class of vision sensors. The method is applicable, in principle, to any known calibrated projection model. The tracking architecture is based on particle filtering, where each particle represents the 3D state of the object rather than its state in the image, thereby overcoming the nonlinearity introduced by the projection model. This allows the use of realistic 3D motion models and the easy incorporation of self-motion measurements. All nonlinearities are concentrated in the observation model: each particle projects a few tens of sample points, located on (and around) the 3D object's surface, onto the image. The likelihood of each state is then evaluated by comparing the color distributions inside and outside the object's occluding contour. Since only pixel-access operations are required, the method avoids image-processing routines such as edge/feature extraction, color segmentation, and 3D reconstruction, which can be sensitive to the motion blur and optical distortions typical of omnidirectional sensors in robotics. We show tracking applications for different objects (balls, boxes), several projection models (catadioptric, dioptric, perspective), and several challenging scenarios (clutter, occlusion, illumination changes, motion and optical blur). We compare our methodology against a state-of-the-art alternative, both on realistic tracking sequences and on generated data with ground truth.
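
The observation model described above lends itself to a compact sketch. The snippet below is a minimal, hypothetical illustration, not the authors' implementation: it assumes a pinhole projection standing in for whatever calibrated model the sensor provides, a spherical target, joint RGB histograms as the color distributions, and an exponential Bhattacharyya-based scoring rule; all function and parameter names are ours.

```python
import numpy as np

def project(points_3d, K):
    """Project 3D camera-frame points to pixel coordinates with a pinhole
    model. Any calibrated projection (catadioptric, dioptric, ...) could
    be substituted here without touching the rest of the filter."""
    p = points_3d @ K.T                  # (N, 3) homogeneous image points
    return p[:, :2] / p[:, 2:3]          # (N, 2) pixel coordinates

def sphere_contour_points(center, radius, n=40, scale=1.0):
    """3D points on a circle approximating a sphere's occluding contour
    as seen from the camera origin (scale < 1: just inside; > 1: just outside)."""
    view = center / np.linalg.norm(center)
    u = np.cross(view, [0.0, 0.0, 1.0])  # any vector normal to the view ray
    if np.linalg.norm(u) < 1e-6:         # ray almost parallel to the z-axis
        u = np.cross(view, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(view, u)
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    ring = np.outer(np.cos(t), u) + np.outer(np.sin(t), v)
    return center + scale * radius * ring            # (n, 3)

def color_hist(image, pixels, bins=8):
    """Normalized joint RGB histogram of the pixels sampled at `pixels`."""
    h, w = image.shape[:2]
    px = np.clip(np.round(pixels).astype(int), 0, [w - 1, h - 1])
    q = image[px[:, 1], px[:, 0]].astype(int) * bins // 256  # quantize RGB
    idx = (q[:, 0] * bins + q[:, 1]) * bins + q[:, 2]
    hist = np.bincount(idx, minlength=bins ** 3).astype(float)
    return hist / max(hist.sum(), 1.0)

def particle_likelihood(image, center, radius, K, model_hist, lam=20.0):
    """Score one particle (a 3D ball hypothesis): the color distribution
    just inside the projected contour should match the target model,
    while the one just outside should not."""
    inner = color_hist(image, project(sphere_contour_points(center, radius, scale=0.8), K))
    outer = color_hist(image, project(sphere_contour_points(center, radius, scale=1.2), K))
    bhatta = lambda p, q: float(np.sum(np.sqrt(p * q)))      # similarity in [0, 1]
    return np.exp(lam * (bhatta(inner, model_hist) - bhatta(outer, model_hist)))
```

In a full tracker, each particle's 3D state would be propagated with a motion model (optionally driven by the robot's self-motion measurements) and resampled in proportion to this likelihood. Because all the nonlinearity is confined to `project`, supporting a different sensor, say a catadioptric or fisheye camera, would mean replacing only that function.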