State-driven particle filter for multi-person tracking

  • Authors:
  • David Gerónimo Gomez; Frédéric Lerasle; Antonio M. López Peña

  • Affiliations:
  • Computer Vision Center and Department of Computer Science, Universitat Autònoma de Barcelona, Bellaterra, Spain; CNRS-LAAS, Toulouse, France, and Université de Toulouse (UPS), Toulouse, France; Computer Vision Center and Department of Computer Science, Universitat Autònoma de Barcelona, Bellaterra, Spain

  • Venue:
  • ACIVS'12: Proceedings of the 14th International Conference on Advanced Concepts for Intelligent Vision Systems
  • Year:
  • 2012


Abstract

Multi-person tracking can be exploited in applications such as driver assistance, surveillance, multimedia and human-robot interaction. Combined with human detectors, particle filters offer a robust method that can filter noisy detections and provide temporal coherence. However, common problems such as occlusions with other targets or with the scene, temporal drifting, or the detection of lost targets are rarely considered, which degrades system performance. Some authors propose to overcome these problems with heuristics that are neither explained nor formalized in the papers, for instance by defining exceptions to model updating depending on track overlap. In this paper we propose to formalize these events with a state graph that explicitly defines the current state of each track (e.g., potential, tracked, occluded or lost) and the transitions between states. This approach has the advantage of linking track actions, such as online updating of the underlying models, to the track state, which gives flexibility to the system. It provides an explicit representation for adapting the multiple parallel trackers to the context: each track can use a specific filtering strategy, dynamic model, number of particles, etc., depending on its state. We implement this technique in a single-camera multi-person tracker and evaluate it on public video sequences.
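The abstract's core idea, a per-track state graph whose states carry their own tracker configuration, can be sketched as a small finite-state machine. The following is a minimal illustration, not the authors' implementation: the state names (potential, tracked, occluded, lost) come from the abstract, while the transition thresholds, particle counts, and the `update_model` flag (frozen during occlusion to avoid appearance drift) are hypothetical choices for the sketch.

```python
from dataclasses import dataclass
from enum import Enum, auto


class TrackState(Enum):
    POTENTIAL = auto()  # newly created track, not yet confirmed
    TRACKED = auto()    # confirmed and supported by detections
    OCCLUDED = auto()   # overlapped by another track or the scene
    LOST = auto()       # no supporting detections for too long


# Hypothetical per-state configuration: each state may use a different
# number of particles and decide whether the underlying appearance
# model is updated (e.g., frozen while occluded to prevent drift).
STATE_CONFIG = {
    TrackState.POTENTIAL: {"n_particles": 50, "update_model": False},
    TrackState.TRACKED: {"n_particles": 100, "update_model": True},
    TrackState.OCCLUDED: {"n_particles": 200, "update_model": False},
    TrackState.LOST: {"n_particles": 300, "update_model": False},
}


@dataclass
class Track:
    state: TrackState = TrackState.POTENTIAL
    hits: int = 0    # consecutive frames with an associated detection
    misses: int = 0  # consecutive frames without one

    def step(self, detected: bool, occluded_by_other: bool,
             confirm_after: int = 3, lose_after: int = 5) -> TrackState:
        """Advance the state graph by one frame (illustrative transitions)."""
        if detected:
            self.hits += 1
            self.misses = 0
        else:
            self.hits = 0
            self.misses += 1

        if self.state is TrackState.POTENTIAL:
            if self.hits >= confirm_after:
                self.state = TrackState.TRACKED
            elif self.misses >= lose_after:
                self.state = TrackState.LOST
        elif self.state is TrackState.TRACKED:
            if occluded_by_other:
                self.state = TrackState.OCCLUDED
            elif self.misses >= lose_after:
                self.state = TrackState.LOST
        elif self.state is TrackState.OCCLUDED:
            if detected and not occluded_by_other:
                self.state = TrackState.TRACKED
            elif self.misses >= lose_after:
                self.state = TrackState.LOST
        # LOST is terminal in this sketch; a full system would delete
        # the track or attempt re-identification from there.
        return self.state

    @property
    def config(self) -> dict:
        """Tracker parameters for the current state."""
        return STATE_CONFIG[self.state]
```

A track starts as POTENTIAL, is promoted to TRACKED after a few consecutive detections, drops to OCCLUDED when another track overlaps it, and returns to TRACKED once it is detected unoccluded again; while OCCLUDED its configuration disables model updating, which is exactly the kind of state-dependent exception the paper argues should be made explicit rather than left as an unstated heuristic.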