Joint Likelihood Methods for Mitigating Visual Tracking Disturbances

  • Authors: Christopher Rasmussen
  • Affiliations: -
  • Venue: WOMOT '01 Proceedings of the IEEE Workshop on Multi-Object Tracking (WOMOT'01)
  • Year: 2001

Abstract

We describe a framework that explicitly reasons about data association and combines estimates to improve tracking performance in many difficult visual environments. This work extends two previously reported algorithms: the probabilistic data association filter (PDAF), which handles single-target tracking tasks involving agile motion and clutter, and the joint PDAF (JPDAF), which shares information between multiple same-modality trackers (such as homogeneous regions, textured regions, or snakes). The capabilities of these methods are improved in two steps. First, a Joint Likelihood Filter allows mixed tracker modalities when tracking several objects and handles overlaps robustly. Second, a Constrained Joint Likelihood Filter tracks complex objects as conjunctions of cues that are diverse both geometrically (e.g., parts) and qualitatively (e.g., attributes). Rigid and hinge constraints between part trackers, together with multiple descriptive attributes for individual parts, make the whole object more distinctive and reduce susceptibility to mistracking. The generality of our approach allows easy application to different target types, and its flexible definition permits straightforward incorporation of other modalities.
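
Because the framework builds on the PDAF, a minimal sketch of a standard PDAF-style update may help make the data-association idea concrete. It assumes a linear-Gaussian measurement model; the function name `pdaf_update`, the detection probability `p_detect`, and the clutter density are illustrative choices, not the paper's notation or implementation.

```python
import numpy as np


def pdaf_update(x_pred, P_pred, H, R, measurements,
                p_detect=0.9, clutter_density=1e-3):
    """Probability-weighted (PDAF-style) measurement update.

    Each candidate measurement is weighted by its association probability,
    and a 'no valid measurement' hypothesis is included so clutter cannot
    force a hard, possibly wrong, assignment. Illustrative sketch only.
    """
    z_pred = H @ x_pred                    # predicted measurement
    S = H @ P_pred @ H.T + R               # innovation covariance
    S_inv = np.linalg.inv(S)
    K = P_pred @ H.T @ S_inv               # Kalman gain

    # Gaussian likelihood of each candidate, evaluated at its innovation
    innovations = [z - z_pred for z in measurements]
    norm = 1.0 / np.sqrt(np.linalg.det(2.0 * np.pi * S))
    likelihoods = np.array([norm * np.exp(-0.5 * v @ S_inv @ v)
                            for v in innovations])

    # Association probabilities; beta[0] is "all candidates are clutter"
    weights = np.concatenate(([clutter_density * (1.0 - p_detect)],
                              p_detect * likelihoods))
    beta = weights / weights.sum()

    # Combined innovation: probability-weighted sum of individual innovations
    v_comb = sum((b * v for b, v in zip(beta[1:], innovations)),
                 np.zeros_like(z_pred))
    x_new = x_pred + K @ v_comb

    # Covariance update, inflated by the spread of the candidate innovations
    P_c = P_pred - K @ S @ K.T
    spread = sum((b * np.outer(v, v) for b, v in zip(beta[1:], innovations)),
                 np.zeros_like(S)) - np.outer(v_comb, v_comb)
    P_new = beta[0] * P_pred + (1.0 - beta[0]) * P_c + K @ spread @ K.T
    return x_new, P_new
```

The JPDAF and the Joint Likelihood Filter described in the abstract replace this per-target weighting with joint hypotheses over all tracked objects, which is what allows mixed modalities and overlap handling.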