A Unified Framework for Tracking through Occlusions and across Sensor Gaps

  • Authors:
  • Robert Kaucic, A. G. Amitha Perera, Glen Brooksby, John Kaufhold, Anthony Hoogs

  • Affiliations:
  • General Electric Global Research (all authors)

  • Venue:
  • CVPR '05: Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Volume 1
  • Year:
  • 2005

Abstract

A common difficulty encountered in tracking applications is how to track an object that becomes totally occluded, possibly for a significant period of time. Another problem is how to associate objects, or tracklets, across non-overlapping cameras, or between observations of a moving sensor that switches fields of regard. A third problem is how to update appearance models for tracked objects over time. Rather than relying on a comprehensive multi-object tracker that must deal with all of these challenges simultaneously, we present a novel, modular framework that handles each of them in a unified manner through the initialization, tracking, and linking of high-confidence tracklets. In this track/suspend/match paradigm, we first analyze the scene to identify areas where tracked objects are likely to become occluded. Tracking is then suspended on occluded objects and re-initiated when they emerge from behind the occlusion. We then associate, or match, suspended tracklets with the new tracklets, using full kinematic models for object motion and Gibbsian distributions for object appearance, in order to complete the track through the occlusion. Sensor gaps are handled in a similar manner: tracking is suspended when the sensor looks away and re-initiated when the sensor returns. Changes in object appearance and orientation during tracking are also handled seamlessly in this framework: tracklets with low lock scores (i.e., low tracking confidence) are terminated, tracking resumes on the untracked movers with updated appearance models, and the new tracklets are linked back to the terminated ones as appropriate. Fully automatic tracking results from a moving sensor are presented.
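
To make the track/suspend/match loop concrete, the sketch below shows one way the tracklet-linking step could be structured. It is illustrative only: the names (Tracklet, match_score, link_tracklets), the constant-velocity predictor, the squared-distance "appearance energy" standing in for the paper's Gibbsian appearance model, and the parameters beta, sigma, and threshold are all our own assumptions, not the authors' implementation.

```python
# Minimal sketch of tracklet linking in a track/suspend/match framework.
# All names and parameters are illustrative, not from the paper: the
# kinematic model is a simple constant-velocity predictor, and the
# appearance energy is a placeholder for the paper's Gibbsian model.
import math
from dataclasses import dataclass

@dataclass
class Tracklet:
    track_id: int
    position: tuple       # (x, y) at the last observation
    velocity: tuple       # (vx, vy) estimated over the tracklet
    appearance: list      # appearance feature vector (e.g., a histogram)
    last_frame: int
    suspended: bool = False

def predict(t: Tracklet, frame: int) -> tuple:
    """Constant-velocity kinematic prediction to the given frame."""
    dt = frame - t.last_frame
    return (t.position[0] + t.velocity[0] * dt,
            t.position[1] + t.velocity[1] * dt)

def appearance_energy(a: list, b: list) -> float:
    """Squared feature distance; a stand-in for a Gibbsian energy."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match_score(old: Tracklet, new: Tracklet,
                beta: float = 1.0, sigma: float = 10.0) -> float:
    """Combine a kinematic likelihood with a Gibbs-style appearance
    likelihood exp(-beta * E). Higher means a better match."""
    px, py = predict(old, new.last_frame)
    d2 = (px - new.position[0]) ** 2 + (py - new.position[1]) ** 2
    kinematic = math.exp(-d2 / (2 * sigma ** 2))
    appearance = math.exp(-beta * appearance_energy(old.appearance,
                                                    new.appearance))
    return kinematic * appearance

def link_tracklets(suspended: list, new: list, threshold: float = 0.1):
    """Greedily associate suspended tracklets with newly initiated ones.
    Mutates `new` by removing claimed tracklets; the paper's matcher
    may well use a more global assignment than this greedy pass."""
    links = []
    for s in suspended:
        best = max(new, key=lambda n: match_score(s, n), default=None)
        if best is not None and match_score(s, best) >= threshold:
            links.append((s.track_id, best.track_id))
            new.remove(best)
    return links

if __name__ == "__main__":
    # A tracklet suspended at an occlusion, and one that emerged later.
    old = Tracklet(1, (0.0, 0.0), (1.0, 0.0), [0.5, 0.5], last_frame=0,
                   suspended=True)
    fresh = Tracklet(2, (9.0, 0.5), (1.0, 0.0), [0.48, 0.52], last_frame=10)
    print(link_tracklets([old], [fresh]))  # -> [(1, 2)]
```

The same linking routine serves both cases described in the abstract: occlusions (suspend when the object enters a predicted occlusion zone, match when new tracklets appear beyond it) and sensor gaps (suspend when the sensor looks away, match when it returns).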