Robust foreground detection in videos using adaptive color histogram thresholding and shadow removal

  • Authors:
  • Akintola Kolawole; Alireza Tavakkoli

  • Affiliations:
  • University of Houston-Victoria; University of Houston-Victoria

  • Venue:
  • ISVC'11: Proceedings of the 7th International Conference on Advances in Visual Computing - Volume Part II
  • Year:
  • 2011

Abstract

Fundamental to advanced video processing tasks such as object tracking, gait recognition, and video indexing is robust background and foreground segmentation. Several methods have been explored for this problem, but they are either time- or memory-intensive or insufficiently accurate in their segmentation. This paper proposes an accurate and fast foreground detection technique for object tracking in videos with quasi-stationary backgrounds. The background is modeled using a novel real-time kernel density estimation approach based on online histogram learning. Shadows are misclassified as foreground pixels unless the illumination conditions of the foreground regions are further analyzed; a morphological approach is therefore developed to remove shadows from the segmented foreground image. The main contributions of the proposed foreground detection approach are its low memory requirements, low processing time, suitability for parallel processing, and accurate segmentation. The technique has been tested on a variety of indoor and outdoor sequences for segmentation of foreground and background. The data is structured so that it can be processed on multi-core parallel architectures: tests on dual-core and quad-core processors demonstrated two-fold and four-fold speed-ups when the system was distributed across parallel hardware. A potential direction for the proposed approach is to investigate its performance on a CUDA-enabled Graphics Processing Unit (GPU), since parallel processing capabilities are built into the architecture.
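The abstract only sketches the method at a high level. As an illustration of the general idea (a per-pixel color histogram learned online, an adaptive probability threshold for the foreground decision, and a morphological clean-up pass), the following is a minimal Python/OpenCV sketch. The class name, bin count, learning rate, and threshold are assumptions made for this example, not the authors' implementation, and the simple morphological opening shown here stands in for the paper's more elaborate illumination-based shadow removal.

```python
import numpy as np
import cv2  # used for the morphological post-processing step

BINS = 16           # assumed number of histogram bins per color channel
ALPHA = 0.05        # assumed learning rate for the online histogram update
FG_THRESHOLD = 0.05 # assumed per-channel probability threshold (uniform init = 1/16)

class HistogramBackgroundModel:
    """Per-pixel online color histograms with thresholding (illustrative sketch only)."""

    def __init__(self, height, width, channels=3):
        # One normalized histogram per pixel and channel; start from a uniform prior.
        # In practice the model would be bootstrapped from an initial training sequence.
        self.hist = np.full((height, width, channels, BINS), 1.0 / BINS, dtype=np.float32)

    def _bin_indices(self, frame):
        # Map 8-bit intensities to histogram bin indices in [0, BINS).
        return (frame.astype(np.int32) * BINS) // 256

    def apply(self, frame):
        """Return a binary foreground mask for one BGR frame and update the model."""
        bins = self._bin_indices(frame)                 # shape (H, W, C)
        h, w, c = bins.shape
        yy, xx, cc = np.ogrid[:h, :w, :c]

        # Per-channel probability of the observed bin under the background model.
        prob = self.hist[yy, xx, cc, bins]              # shape (H, W, C)

        # A pixel is background only if every channel looks "typical".
        bg = (prob > FG_THRESHOLD).all(axis=2)
        fg_mask = np.where(bg, 0, 255).astype(np.uint8)

        # Blind online update with exponential forgetting.
        self.hist *= (1.0 - ALPHA)
        self.hist[yy, xx, cc, bins] += ALPHA

        # Morphological opening to suppress small spurious blobs; the paper's
        # shadow removal is a separate illumination-aware step not shown here.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        return cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)


# Usage sketch: process frames from a video file (path is a placeholder).
cap = cv2.VideoCapture("video.avi")
ok, frame = cap.read()
model = HistogramBackgroundModel(*frame.shape[:2])
while ok:
    mask = model.apply(frame)
    ok, frame = cap.read()
```

Because each pixel's histogram is updated independently, the frame can be split into tiles processed on separate cores, which is consistent with the parallel, multi-core structure the abstract describes.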