Fusion of texture variation and on-line color sampling for moving object detection under varying chromatic illumination

  • Authors:
  • Chunfeng Shen;Xueyin Lin;Yuanchun Shi

  • Affiliations:
Key Lab of Pervasive Computing (MOE), Dept. of Computer Science & Technology, Tsinghua University, Beijing, P.R. China (all authors)

  • Venue:
ACCV'06: Proceedings of the 7th Asian Conference on Computer Vision - Volume Part I
  • Year:
  • 2006

Abstract

In this paper, a novel approach to non-rigid moving object detection under varying chromatic illumination is proposed. Unlike most algorithms that rely on color information, it does not require the assumption that illumination changes smoothly or globally. Our method is based on the observation that the color appearance of objects may change with the intensity and color of the lighting, while their texture structure remains almost the same. Therefore, a texture-based characteristic that is invariant to varying illumination is extracted and modeled, and used to guide the construction of a color appearance model at each frame. Following this philosophy, texture variation, which is insensitive to illumination, is first extracted by comparing the current image with the background image. Second, an instantaneous color model is created by a dedicated sampling algorithm driven by the texture variation and the previous consecutive detection results. By fusing texture variation and on-line color sampling, an energy function is formulated and minimized to obtain the target contour. Experiments show that this approach is highly capable of detecting non-rigid objects under globally or locally varying illumination, even when the hue and saturation of the lighting change abruptly or locally.
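
The abstract outlines a three-stage pipeline: an illumination-insensitive texture-variation cue computed between the current frame and a background image, an on-line color model sampled where the texture has changed, and an energy that fuses both cues. Below is a minimal Python sketch of that pipeline under stated assumptions: the gradient-orientation texture descriptor, the RGB histogram color model, the function names, and the simple per-pixel thresholding (standing in for the paper's contour energy minimization) are illustrative choices, not the authors' implementation.

```python
import numpy as np


def texture_descriptor(gray, cell=8, nbins=8):
    """Per-cell histogram of gradient orientations, weighted by magnitude.
    Gradient orientation is largely insensitive to illumination changes,
    which is the property the method relies on (illustrative descriptor)."""
    gy, gx = np.gradient(gray.astype(np.float64))
    mag, ang = np.hypot(gx, gy), np.arctan2(gy, gx)
    hc, wc = gray.shape[0] // cell, gray.shape[1] // cell
    desc = np.zeros((hc, wc, nbins))
    for i in range(hc):
        for j in range(wc):
            sl = (slice(i * cell, (i + 1) * cell),
                  slice(j * cell, (j + 1) * cell))
            hist, _ = np.histogram(ang[sl], bins=nbins,
                                   range=(-np.pi, np.pi), weights=mag[sl])
            desc[i, j] = hist / (hist.sum() + 1e-9)
    return desc


def texture_variation(bg_gray, cur_gray, cell=8):
    """L1 distance between background and current texture descriptors, per cell."""
    return np.abs(texture_descriptor(bg_gray, cell)
                  - texture_descriptor(cur_gray, cell)).sum(axis=2)


def upsample(cell_map, cell, shape):
    """Expand a per-cell map back to image resolution."""
    full = np.repeat(np.repeat(cell_map, cell, axis=0), cell, axis=1)
    out = np.zeros(shape, dtype=full.dtype)
    out[:full.shape[0], :full.shape[1]] = full[:shape[0], :shape[1]]
    return out


def sample_color_model(cur_rgb, variation, cell=8, thresh=0.5, bins=8):
    """On-line foreground color histogram sampled where texture has changed."""
    mask = upsample(variation > thresh, cell, cur_rgb.shape[:2])
    fg = cur_rgb[mask].reshape(-1, 3).astype(np.float64)
    if fg.size == 0:
        return np.full((bins,) * 3, 1.0 / bins ** 3)  # uninformative model
    hist, _ = np.histogramdd(fg, bins=(bins,) * 3, range=[(0, 256)] * 3)
    return hist / hist.sum()


def detect(bg_rgb, cur_rgb, cell=8, alpha=0.5):
    """Fuse the texture-variation cue and the sampled color model into a
    per-pixel foreground score and threshold it (the paper instead minimizes
    a contour energy over both cues)."""
    var = texture_variation(bg_rgb.mean(axis=2), cur_rgb.mean(axis=2), cell)
    tex_score = upsample(var / (var.max() + 1e-9), cell, cur_rgb.shape[:2])

    model = sample_color_model(cur_rgb, var, cell)
    bins = model.shape[0]
    idx = np.clip(cur_rgb // (256 // bins), 0, bins - 1).astype(int)
    color_score = model[idx[..., 0], idx[..., 1], idx[..., 2]]
    color_score = color_score / (color_score.max() + 1e-9)

    energy = alpha * tex_score + (1.0 - alpha) * color_score
    return energy > 0.5
```

Rough usage: with bg and frame as uint8 H x W x 3 arrays of the same size, mask = detect(bg, frame) yields a boolean foreground map; alpha weights the texture term against the color term. In the paper itself, the two cues instead enter an energy function whose minimization produces the target contour.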