Multi-target tracking in time-lapse video forensics

  • Authors:
  • Paul Koppen; Marcel Worring

  • Affiliations:
  • University of Amsterdam, Amsterdam, Netherlands; University of Amsterdam, Amsterdam, Netherlands

  • Venue:
  • MiFor '09: Proceedings of the First ACM workshop on Multimedia in forensics
  • Year:
  • 2009

Abstract

To help an officer efficiently review many hours of surveillance recordings, we develop an automated video-analysis system. We introduce a multi-target tracking algorithm that operates on recorded video. Apart from being robust to visual challenges (such as partial and full occlusion, variation in illumination, and changes in camera view), our algorithm is also robust to temporal challenges, i.e., unknown variation in frame rate. The complication with a varying frame rate is that it invalidates motion estimation, so tracking algorithms based on motion models show decreased performance. Appearance-based tracking, on the other hand, suffers from a plethora of false detections. Our tracking algorithm, albeit relying on appearance-based detection, deals robustly with the drawbacks of both approaches. The solution rests on the fact that we can make fully informed choices, based not only on preceding but also on following frames. It works as follows. We assume an object detection algorithm that detects all target objects present in each frame. From these detections we build a graph: the detections form the nodes, and edges connect each detection in one frame to all detections in the following frame. Each path through the graph thus represents a particular selection of successive object detections. Object tracking is then reformulated as a heuristic search for optimal paths, where optimal means finding all detections belonging to a single object and excluding every other detection. We show that this approach, without an explicit motion model, is robust to both the visual and the temporal challenges.
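The graph construction and path search described in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `Detection` structure, the cosine `appearance_similarity` score, and the greedy `greedy_track` extraction are stand-ins for the paper's detector output and its heuristic search over paths.

```python
# Minimal sketch of the graph-based tracking formulation: detections are
# nodes, edges connect every detection in frame t to every detection in
# frame t+1, and a track is a path through that graph. The similarity
# measure and the greedy path search below are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List, Tuple
import numpy as np

@dataclass
class Detection:
    frame: int              # frame index (frame rate may vary between frames)
    feature: np.ndarray     # appearance descriptor produced by the detector

def appearance_similarity(a: Detection, b: Detection) -> float:
    """Cosine similarity between appearance descriptors (assumed metric)."""
    return float(np.dot(a.feature, b.feature) /
                 (np.linalg.norm(a.feature) * np.linalg.norm(b.feature) + 1e-9))

Node = Tuple[int, int]  # (frame index, detection index within that frame)

def build_graph(frames: List[List[Detection]]) -> Dict[Node, List[Tuple[Node, float]]]:
    """Connect each detection in frame t to all detections in frame t+1."""
    edges: Dict[Node, List[Tuple[Node, float]]] = {}
    for t in range(len(frames) - 1):
        for i, det in enumerate(frames[t]):
            edges[(t, i)] = [((t + 1, j), appearance_similarity(det, nxt))
                             for j, nxt in enumerate(frames[t + 1])]
    return edges

def greedy_track(edges: Dict[Node, List[Tuple[Node, float]]], start: Node) -> List[Node]:
    """Follow the highest-similarity edge from a starting detection.
    A simple stand-in for the paper's heuristic search for optimal paths."""
    path = [start]
    node = start
    while node in edges and edges[node]:
        node = max(edges[node], key=lambda e: e[1])[0]
        path.append(node)
    return path

# Hypothetical usage: `frames` is a list of per-frame detection lists.
#   edges = build_graph(frames)
#   track = greedy_track(edges, start=(0, 0))  # follow one object from frame 0
```

Because the graph is built over the whole recording, a path can be scored using both earlier and later frames, which is what lets the method sidestep motion estimation under an unknown, varying frame rate.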