Illumination-insensitive texture discrimination based on illumination compensation and enhancement

  • Authors:
  • Muwei Jian, Kin-Man Lam, Junyu Dong

  • Affiliations:
  • Centre for Signal Processing, Department of Electronic and Information Engineering, The Hong Kong Polytechnic University, Kowloon, Hong Kong (M. Jian and K.-M. Lam); Department of Computer Science, Ocean University of China, Qingdao, China (J. Dong)

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 2014

Abstract

As the appearance of a 3D surface texture depends strongly on the illumination direction, 3D surface-texture classification methods need multiple training images captured under a variety of illumination conditions for each class. Texture images captured under different illumination conditions and directions therefore remain a challenge for texture-image retrieval and classification. This paper proposes an efficient method for illumination-insensitive texture discrimination based on illumination compensation and enhancement. Features extracted from an illumination-compensated or illumination-enhanced texture are insensitive to illumination variation, which improves texture-classification performance. The proposed scheme learns an average illumination-effect matrix for image representation under changing illumination, and uses it to compensate or enhance images, eliminating the effect of varying and uneven illumination while retaining the intrinsic properties of the surface. An advantage of our method is that it does not assume a single-point light source, so it overcomes the limitations of the Lambertian model and is also suitable for outdoor settings. We evaluate the proposed method on a wide range of textures from the PhoTex database. Experimental results demonstrate the effectiveness of the proposed method.
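
To make the compensation idea concrete, below is a minimal sketch of one plausible realization, not the paper's exact procedure. It assumes the illumination effect can be modeled as a smooth multiplicative field, estimated per image by Gaussian low-pass filtering in the log domain and averaged over training images captured under varied lighting; the function names, the smoothing bandwidth `sigma`, and the choice of low-pass estimator are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def learn_avg_illumination_matrix(train_images, sigma=15.0):
    """Estimate an average illumination-effect matrix from training
    images of the same surface captured under varied lighting.

    Hypothetical sketch: each image's illumination field is
    approximated by its low-frequency (Gaussian-smoothed) component
    in the log domain, then averaged across the training set.
    All images are assumed to share the same size.
    """
    log_fields = [gaussian_filter(np.log1p(img.astype(np.float64)), sigma)
                  for img in train_images]
    return np.mean(log_fields, axis=0)

def compensate(image, avg_illum, sigma=15.0):
    """Replace an image's own illumination component with the learned
    average field, retaining the high-frequency surface texture."""
    log_img = np.log1p(image.astype(np.float64))
    own_illum = gaussian_filter(log_img, sigma)
    # Subtracting the image's own illumination field and adding the
    # learned average field normalizes lighting across differently
    # lit images of the same surface.
    compensated = log_img - own_illum + avg_illum
    return np.expm1(compensated)
```

Under these assumptions, texture descriptors (e.g., Gabor or LBP features) computed on the compensated images should vary far less with the lighting direction than descriptors computed on the raw images, which is the property the classification experiments exploit.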