Illumination normalization for robust face recognition using discrete wavelet transform

  • Authors:
  • Amnart Petpon; Sanun Srisuk

  • Affiliations:
  • Department of Computer Engineering, Mahanakorn University of Technology, Bangkok, Thailand (both authors)

  • Venue:
  • ISVC'10: Proceedings of the 6th International Conference on Advances in Visual Computing - Volume Part III
  • Year:
  • 2010

Abstract

In this paper, we introduce an illumination normalization approach in the frequency domain that uses the Discrete Wavelet Transform (DWT) as the transformation function to suppress illumination variations while simultaneously amplifying facial features such as the eyes, eyebrows, nose, and mouth. The basic ideas are: 1) transform a face image from the spatial domain into the frequency domain and separate it into two major components, the approximation coefficients (low frequency) and the detail coefficients (high frequency); 2) remove the illumination variation in the image by adopting the Total Variation Quotient Image (TVQI) or Logarithmic Total Variation (LTV) model; 3) amplify the facial features, which are the key cues for face classification, by applying Gaussian derivatives and morphological operators, respectively. The efficiency of the proposed approach is evaluated on a public face database, the Yale Face Database B, and its extended version, the Extended Yale Face Database B. Our experimental results demonstrate that the proposed approach achieves a high recognition rate even when only a single image per person is used as the training set.
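
The three-step pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the TVQI/LTV step is replaced by a simple smoothed-quotient stand-in, the pairing of Gaussian-derivative orientations with wavelet sub-bands is assumed, and the wavelet name, sigma values, and structuring-element size are placeholder parameters.

```python
# Minimal sketch of a DWT-based illumination normalization pipeline (assumed parameters).
import numpy as np
import pywt
from scipy import ndimage


def normalize_face(image, wavelet="haar", sigma=1.0, selem_size=3):
    """Illustrative normalization of a 2-D grayscale face image."""
    image = image.astype(np.float64)

    # 1) DWT: split into approximation (low-frequency) and detail (high-frequency) bands.
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)

    # 2) Illumination suppression on the low-frequency band.
    #    Stand-in for TVQI/LTV: divide by a smoothed large-scale estimate.
    #    This is NOT the authors' exact model, only a rough surrogate.
    large_scale = ndimage.gaussian_filter(cA, sigma=4.0)
    cA_norm = cA / (large_scale + 1e-6)

    # 3) Feature amplification on the high-frequency bands using
    #    Gaussian derivatives plus a morphological gradient.
    def amplify(band, order):
        deriv = ndimage.gaussian_filter(band, sigma=sigma, order=order)
        grad = ndimage.morphological_gradient(band, size=(selem_size, selem_size))
        return band + deriv + grad

    cH_amp = amplify(cH, order=(0, 1))  # horizontal detail band (assumed pairing)
    cV_amp = amplify(cV, order=(1, 0))  # vertical detail band (assumed pairing)
    cD_amp = amplify(cD, order=(1, 1))  # diagonal detail band (assumed pairing)

    # Reconstruct the normalized image with the inverse DWT.
    return pywt.idwt2((cA_norm, (cH_amp, cV_amp, cD_amp)), wavelet)


# Usage: normalized = normalize_face(face_array)  # face_array: 2-D grayscale array
```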