A stochastic graph evolution framework for robust multi-target tracking

  • Authors:
  • Bi Song, Ting-Yueh Jeng, Elliot Staudt, Amit K. Roy-Chowdhury

  • Affiliations:
  • Dept. of Electrical Engineering, University of California, Riverside, CA (all authors)

  • Venue:
  • ECCV'10 Proceedings of the 11th European Conference on Computer Vision: Part I
  • Year:
  • 2010

Abstract

Maintaining stable tracks on multiple targets in video over extended time periods remains a challenging problem. A few methods that have recently shown encouraging results in this direction rely on learning context models or on the availability of training data. However, this may not be feasible in many application scenarios. Moreover, tracking methods should work across different scenarios (e.g., multiple resolutions of the video), making such context models hard to obtain. In this paper, we consider the problem of long-term tracking in video in application domains where context information is not available a priori, nor can it be learned online. We build our solution on the hypothesis that most existing trackers can obtain reasonable short-term tracks (tracklets). By analyzing the statistical properties of these tracklets, we develop associations between them so as to form longer tracks. This is achieved through a stochastic graph evolution step that considers the statistical properties of individual tracklets, as well as the statistics of the targets along each proposed long-term track. On multiple real-life video sequences spanning low- and high-resolution data, we show the ability to track accurately over extended time periods (results are shown on many minutes of continuous video).
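To make the tracklet-association idea concrete, here is a minimal sketch, not the paper's actual algorithm: tracklets are nodes, directed edges carry an affinity score, and long tracks are chains of greedily linked tracklets. The affinity function below (exponential decay with spatial and temporal gap, and the thresholds `max_gap` and `min_affinity`) is a hypothetical stand-in for the statistical tracklet models and the stochastic graph evolution step described in the abstract.

```python
import math

def tracklet_affinity(t1, t2, max_gap=30.0):
    """Affinity for linking tracklet t1 before tracklet t2.

    t1 must end before t2 starts; the score decays with the spatial
    and temporal gap between t1's last and t2's first observation.
    (Hypothetical stand-in for the paper's statistical models.)
    """
    gap = t2["start_t"] - t1["end_t"]
    if gap <= 0 or gap > max_gap:
        return 0.0
    dx = t2["start_xy"][0] - t1["end_xy"][0]
    dy = t2["start_xy"][1] - t1["end_xy"][1]
    dist = math.hypot(dx, dy)
    return math.exp(-dist / 50.0) * math.exp(-gap / max_gap)

def associate(tracklets, min_affinity=0.1):
    """Greedily link the highest-affinity compatible tracklet pairs,
    then read off chains as long tracks (lists of tracklet indices)."""
    n = len(tracklets)
    pairs = []
    for i in range(n):
        for j in range(n):
            if i != j:
                a = tracklet_affinity(tracklets[i], tracklets[j])
                if a >= min_affinity:
                    pairs.append((a, i, j))
    succ, pred = {}, {}
    # Best pairs first; each tracklet gets at most one successor/predecessor.
    for a, i, j in sorted(pairs, reverse=True):
        if i not in succ and j not in pred:
            succ[i] = j
            pred[j] = i
    tracks = []
    for i in range(n):
        if i not in pred:  # chain start: no predecessor
            chain = [i]
            while chain[-1] in succ:
                chain.append(succ[chain[-1]])
            tracks.append(chain)
    return tracks
```

For example, four tracklets from two targets with short occlusion gaps would be linked into two long tracks, with each broken segment joined to its spatially and temporally nearest continuation.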