Robust online appearance models for visual tracking

  • Authors:
  • Allan D. Jepson; David J. Fleet; Thomas Farid El-Maraghi

  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 2003

Abstract

A framework for learning robust, adaptive appearance models for motion-based tracking of natural objects is proposed. The approach involves a mixture of stable image structure, learned over long time courses, two-frame transient motion information, and an outlier process. This class of appearance models is shown to have low storage requirements, and the model parameters can be learned efficiently with an online variant of the expectation-maximization (EM) algorithm. Two implementations of appearance models based on this framework are developed. The first is based on the filter responses of a steerable pyramid. This model is used in a motion-based tracking algorithm to provide robustness to image outliers, such as those caused by occlusions, while also adapting to natural changes in appearance, such as those due to facial expressions or variations in 3D pose. The second implementation is based on a robust representation of the color of the target object. Experimental results are shown on a variety of natural image sequences of people moving within cluttered environments, both for the wavelet- and color-based appearance models individually and in combination. A quantitative analysis of the tracker's performance is provided for one of the test sequences.
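As a concrete illustration of the model structure the abstract describes, the sketch below runs an online EM update for a per-pixel three-component mixture: a slowly adapting stable component, a two-frame transient ("wandering") component, and a uniform outlier component. This is a minimal reading of the framework under stated assumptions, not the authors' implementation; the class name WSLPixel, the forgetting factor ALPHA, the fixed variances, and the outlier density are all illustrative choices.

```python
import numpy as np

# Sketch of an online EM update for a three-component appearance mixture:
# a slowly adapting Stable component, a two-frame Wandering component, and
# a uniform outlier (Lost) component. Parameter values are assumptions.

OUTLIER_DENSITY = 1.0 / 256.0  # uniform density over an assumed 8-bit range
ALPHA = 0.05                   # assumed forgetting factor for online updates

def gaussian(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

class WSLPixel:
    """State for one pixel (or one filter response) of the appearance model."""

    def __init__(self, x0, var0=25.0):
        self.mix = np.array([0.4, 0.4, 0.2])  # stable, wandering, lost
        self.mu_s, self.var_s = x0, var0      # stable mean / variance
        self.prev = x0                        # previous observation (wandering mean)
        self.var_w = var0                     # wandering variance (held fixed here)

    def update(self, x):
        # E-step: ownership probabilities of the current observation.
        lik = np.array([
            gaussian(x, self.mu_s, self.var_s),  # stable
            gaussian(x, self.prev, self.var_w),  # wandering (two-frame)
            OUTLIER_DENSITY,                     # lost / outlier
        ])
        own = self.mix * lik
        own /= own.sum()

        # M-step: exponentially weighted (online) parameter updates.
        self.mix = (1 - ALPHA) * self.mix + ALPHA * own
        w = ALPHA * own[0]                       # data weight for the stable component
        self.mu_s = (1 - w) * self.mu_s + w * x
        self.var_s = (1 - w) * self.var_s + w * (x - self.mu_s) ** 2
        self.prev = x                            # wandering component tracks the last frame
        return own

# Usage: feed one observation per frame; high stable ownership marks the
# pixel as reliable for motion estimation, while outlier ownership rises
# under occlusion-like jumps (mimicked here by the 200.0 observation).
pix = WSLPixel(x0=128.0)
for obs in [129.0, 127.5, 128.2, 200.0, 128.1]:
    ownership = pix.update(obs)
```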