Mitigating the Effects of Variable Illumination for Tracking across Disjoint Camera Views

  • Authors:
  • E. D. Cheng; C. Madden; M. Piccardi (Senior Member, IEEE)

  • Affiliations:
  • University of Technology, Australia

  • Venue:
  • AVSS '06 Proceedings of the IEEE International Conference on Video and Signal Based Surveillance
  • Year:
  • 2006


Abstract

Tracking people by their appearance across disjoint camera views is challenging since appearance may vary significantly across such views. This problem has been tackled in the past by computing intensity transfer functions between each camera pair during an initial training stage. However, in real-life situations, intensity transfer functions depend not only on the camera pair but also on the actual illumination at pixel-wise resolution, and may prove impractical to estimate to a satisfactory extent. For this reason, in this paper we propose an appearance representation for people tracking capable of coping with the typical illumination changes occurring in a surveillance scenario. Our appearance representation is based on an online K-means color clustering algorithm, a fixed, data-dependent intensity transformation, and the incremental use of frames. Moreover, a similarity measurement is proposed to match the appearance representations of any two given moving objects along sequences of frames. Experimental results presented in this paper show that the proposed method provides a viable and effective approach for tracking people across disjoint camera views in typical surveillance scenarios.
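To illustrate the kind of appearance model the abstract describes, the sketch below runs a sequential (online) K-means over a person's RGB pixels, so that frames can be folded in incrementally, and compares two resulting centroid sets with a simple symmetric nearest-centroid distance. This is a minimal illustration under assumed parameters (`k`, learning rate `lr`, random initialisation, and the `appearance_distance` measure are all choices made here for the example), not the paper's exact algorithm or similarity measurement.

```python
import random

def online_kmeans(pixels, k=4, lr=0.05, seed=0):
    """Online (sequential) K-means color clustering.

    Each pixel is an (r, g, b) tuple. Centroids are nudged one pixel at
    a time, so new frames can update the model incrementally.
    k, lr and the init scheme are illustrative assumptions.
    """
    rng = random.Random(seed)
    # Initialise centroids from randomly chosen pixels.
    centroids = [list(p) for p in rng.sample(pixels, k)]
    for p in pixels:
        # Nearest centroid by squared Euclidean distance in RGB space.
        j = min(range(k),
                key=lambda i: sum((c - x) ** 2
                                  for c, x in zip(centroids[i], p)))
        # Move that centroid a small step towards the new pixel.
        for d in range(3):
            centroids[j][d] += lr * (p[d] - centroids[j][d])
    return centroids

def appearance_distance(ca, cb):
    """Symmetric average nearest-centroid distance between two models
    (an illustrative stand-in for the paper's similarity measurement)."""
    def one_way(a, b):
        return sum(min(sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
                       for q in b)
                   for p in a) / len(a)
    return max(one_way(ca, cb), one_way(cb, ca))

# Toy example: pixels drawn from two dominant clothing colors.
pixels = [(200, 30, 30)] * 50 + [(20, 20, 180)] * 50
random.Random(1).shuffle(pixels)
model = online_kmeans(pixels, k=2)
```

In use, two detections of the same person under different cameras should yield centroid sets with a small `appearance_distance`, while detections of differently dressed people should score larger; the paper additionally applies an intensity transformation before matching to compensate for illumination changes, which this sketch omits.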