Specularity removal in images and videos: a PDE approach

  • Authors:
  • Satya P. Mallick; Todd Zickler; Peter N. Belhumeur; David J. Kriegman

  • Affiliations:
  • Computer Science and Engineering, University of California at San Diego, CA; Engineering and Applied Sciences, Harvard University, Cambridge, MA; Computer Science, Columbia University, New York, NY; Computer Science and Engineering, University of California at San Diego, CA

  • Venue:
  • ECCV'06: Proceedings of the 9th European Conference on Computer Vision, Part I
  • Year:
  • 2006

Abstract

We present a unified framework for separating specular and diffuse reflection components in images and videos of textured scenes. This can be used for specularity removal and for independently processing, filtering, and recombining the two components. Beginning with a partial separation provided by an illumination-dependent color space, the challenge is to complete the separation using spatio-temporal information. This is accomplished by evolving a partial differential equation (PDE) that iteratively erodes the specular component at each pixel. A family of PDEs appropriate for differing image sources (still images vs. videos), differing prior information (e.g., highly vs. lightly textured scenes), or differing prior computations (e.g., optical flow) is introduced. In contrast to many other methods, explicit segmentation and/or manual intervention are not required. We present results on high-quality images and video acquired in the laboratory in addition to images taken from the Internet. Results on the latter demonstrate robustness to low dynamic range, JPEG artifacts, and lack of knowledge of illuminant color. Empirical comparison to physical removal of specularities using polarization is provided. Finally, an application termed dichromatic editing is presented in which the diffuse and the specular components are processed independently to produce a variety of visual effects.
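To make the pipeline concrete, the sketch below is a rough, simplified illustration of the two stages described in the abstract, not the authors' formulation: it builds an SUV-style illumination-dependent color space (one axis aligned with the illuminant color, which is assumed known), and then mimics the iterative "erosion" of the specular component with a crude morphological minimum over a 4-neighborhood instead of the paper's family of texture- and flow-guided PDEs. All function names, the neighborhood choice, and the iteration count are illustrative assumptions.

```python
import numpy as np

def suv_basis(source_rgb):
    """Orthonormal basis whose first axis is the normalized illuminant color.

    Assumes the illuminant color is (approximately) known; the abstract notes
    the method is robust to imperfect knowledge of it.
    """
    s = np.asarray(source_rgb, dtype=float)
    s = s / np.linalg.norm(s)
    # Two arbitrary orthonormal vectors spanning the plane orthogonal to s.
    a = np.array([1.0, 0.0, 0.0]) if abs(s[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(s, a); u /= np.linalg.norm(u)
    v = np.cross(s, u)
    return np.stack([s, u, v])           # rows: S, U, V axes

def remove_specularity(img, source_rgb, iters=10, eps=1e-6):
    """Erosion-style approximation of specular/diffuse separation.

    img: HxWx3 float RGB in [0, 1]. Returns (diffuse_rgb, specular_rgb).
    NOT the paper's PDE: each iteration replaces a pixel's S/rho ratio with
    the smallest ratio in its 4-neighborhood, shrinking the specular part.
    """
    R = suv_basis(source_rgb)
    suv = img @ R.T                       # per-pixel [S, U, V]
    S, U, V = suv[..., 0], suv[..., 1], suv[..., 2]
    rho = np.sqrt(U**2 + V**2)            # purely diffuse magnitude (U, V channels)
    ratio = S / (rho + eps)               # specular pixels have an inflated ratio
    for _ in range(iters):
        shifted = [np.roll(ratio, 1, 0), np.roll(ratio, -1, 0),
                   np.roll(ratio, 1, 1), np.roll(ratio, -1, 1)]
        ratio = np.minimum(ratio, np.minimum.reduce(shifted))
    S_diffuse = ratio * rho               # diffuse contribution to the S channel
    specular = np.clip(S - S_diffuse, 0.0, None)
    diffuse_rgb = np.stack([S_diffuse, U, V], axis=-1) @ R   # back to RGB
    s_dir = np.asarray(source_rgb, dtype=float)
    specular_rgb = specular[..., None] * (s_dir / np.linalg.norm(s_dir))
    return diffuse_rgb, specular_rgb
```

This toy version over-erodes wherever diffuse shading varies within the neighborhood; the PDEs in the paper avoid that by adapting the erosion to scene texture and, for video, to optical flow. The separated outputs could then be processed independently and recombined, in the spirit of the dichromatic editing application mentioned above.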