Face illumination transfer through edge-preserving filters

  • Authors:
  • Xiaowu Chen; Mengmeng Chen; Xin Jin; Qinping Zhao

  • Affiliations:
  • State Key Lab. of Virtual Reality Technol. & Syst., Beihang Univ., Beijing, China

  • Venue:
  • CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2011


Abstract

This article proposes a novel image-based method to transfer illumination from a reference face image to a target face image through edge-preserving filters. Our method requires only a single reference image, without any knowledge of the 3D geometry or material information of the target face. After face alignment, we first decompose the lightness layers of the reference and target images into large-scale and detail layers with a weighted least squares (WLS) filter. The large-scale layer of the reference image is then filtered under the guidance of the target image. Adaptive parameter selection schemes for the edge-preserving filters are proposed for these two filtering steps. The final relit result is obtained by replacing the large-scale layer of the target image with that of the reference image. We obtain convincing relit results on numerous target and reference face images with different lighting effects and genders. Comparisons with previous work show that our method is less affected by geometry differences and better preserves the identifying structure and skin color of the target face.
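The layer-swap idea at the core of the method can be sketched as follows. This is a minimal illustration, not the authors' implementation: it substitutes a simple bilateral filter for the WLS filter used in the paper, omits face alignment, the guided-filtering step, and the adaptive parameter selection, and the function names and parameters are hypothetical.

```python
import numpy as np

def bilateral_filter(img, radius=5, sigma_s=3.0, sigma_r=0.1):
    """Simple brute-force bilateral filter as an edge-preserving smoother.
    (Stand-in for the WLS filter of the paper; img is a 2-D lightness map.)"""
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    # Fixed spatial Gaussian kernel over the (2r+1) x (2r+1) window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2.0 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weights: nearby intensities count more, so edges survive.
            rng = np.exp(-((patch - img[i, j])**2) / (2.0 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

def transfer_illumination(target_L, reference_L):
    """Relight the target: keep the target's detail layer (identity/texture)
    and replace its large-scale layer (illumination) with the reference's."""
    large_t = bilateral_filter(target_L)
    large_r = bilateral_filter(reference_L)
    detail_t = target_L - large_t       # target's fine structure
    return large_r + detail_t           # relit target lightness
```

Note the sanity property of the decomposition: if the reference equals the target, the swap is a no-op and the target lightness is returned unchanged.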