Illumination normalization with time-dependent intrinsic images for video surveillance

  • Authors:
  • Yasuyuki Matsushita; Ko Nishino; Katsushi Ikeuchi; Masao Sakauchi

  • Affiliations:
  • Institute of Industrial Science, The University of Tokyo, Tokyo, Japan; Department of Computer Science, Columbia University, New York; Institute of Industrial Science, The University of Tokyo, Tokyo, Japan; Institute of Industrial Science, The University of Tokyo, Tokyo, Japan

  • Venue:
  • CVPR '03: Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2003

Abstract

Cast shadows produce troublesome effects for video surveillance systems, typically for object tracking from a fixed viewpoint, since they cause appearance variations of objects depending on whether the objects are inside or outside the shadow. To eliminate these shadows from image sequences as a preprocessing stage for robust video surveillance, we propose a framework based on the idea of intrinsic images. Unlike previous methods for deriving intrinsic images, we derive time-varying reflectance images and corresponding illumination images from a sequence of images. Using the obtained illumination images, we normalize the input image sequence in terms of incident lighting distribution to eliminate shadow effects. We also propose an illumination normalization scheme that can potentially run in real time, utilizing the illumination eigenspace, which captures the illumination variations due to weather, time of day, etc., and a shadow interpolation method based on shadow hulls. This paper describes the theory of the framework with simulation results, and demonstrates its effectiveness with object tracking results on real scene data sets for traffic monitoring.
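To give a flavor of the decomposition the abstract describes, the following is a minimal illustrative sketch, not the paper's actual estimator: it assumes a static camera and grayscale frames, takes the temporal median of log intensities as a static log-reflectance estimate, and treats each frame's residual as its illumination image. Dividing a frame by its illumination image then suppresses cast-shadow variation. The function names and the use of a temporal median are assumptions for illustration.

```python
import numpy as np

def decompose_sequence(frames, eps=1e-6):
    """Illustrative intrinsic-image decomposition of a video sequence
    (a simplified stand-in for the paper's method): the temporal median
    of log intensities serves as a single log-reflectance estimate, and
    each frame's residual is its per-frame log-illumination image."""
    logs = np.log(frames.astype(np.float64) + eps)   # shape (T, H, W)
    log_reflectance = np.median(logs, axis=0)        # shadow-free component
    log_illumination = logs - log_reflectance        # per-frame lighting
    return np.exp(log_reflectance), np.exp(log_illumination)

def normalize_illumination(frames, eps=1e-6):
    """Divide each frame by its illumination image, so object appearance
    no longer depends on whether the object lies inside a cast shadow."""
    _, illumination = decompose_sequence(frames, eps)
    return frames / (illumination + eps)
```

In this toy model, pixels shadowed in only a minority of frames recover their unshadowed appearance after normalization; the paper's framework goes further by allowing the reflectance itself to vary over time.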