New models and methods for matting and compositing

  • Authors: Yung-Yu Chuang; Brian Curless; David H. Salesin
  • Venue: New models and methods for matting and compositing (doctoral dissertation)
  • Year: 2004

Abstract

Matting and compositing are fundamental operations in graphics and visual effects. Despite having enjoyed wide usage for many years, traditional matting and compositing have limitations. Traditional matting methods either require special setups or cannot handle objects with complex silhouettes. Furthermore, the traditional compositing model is effective in modeling color blending effects but not reflection, refraction, and shadows. In this dissertation, we address these limitations and present a set of new compositing models and matting methods. To pull mattes of complex silhouettes from natural images, we introduce a principled statistical approach called Bayesian image matting. We also extend this algorithm to handle video sequences with the help of optical flow computation and background estimation. On the compositing side, previous work on environment matting has been shown to handle refraction and reflection, but the resulting mattes are not very accurate. We propose an environment matting model and method that achieves greater accuracy at the cost of requiring more input images. For shadows, we develop a physically motivated shadow compositing equation. Based on this equation, we introduce a shadow matting method for extracting shadow mattes from videos with natural backgrounds, and we demonstrate a novel process for acquiring the photometric and geometric properties of the background to enable creation of realistic shadow composites. Finally, we present a novel application of Bayesian image matting for animating still pictures.
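For context, the "traditional compositing model" the abstract refers to is the standard matting equation, which blends a foreground and a background through an alpha matte: C = αF + (1 − α)B. The sketch below is an illustrative NumPy implementation of this standard equation (the function name and array shapes are our own choices, not part of the dissertation):

```python
import numpy as np

def composite_over(foreground, background, alpha):
    """Traditional compositing: C = alpha * F + (1 - alpha) * B.

    foreground, background: float arrays of shape (H, W, 3), values in [0, 1].
    alpha: float matte of shape (H, W), values in [0, 1]
           (1 = fully foreground, 0 = fully background).
    """
    a = alpha[..., np.newaxis]  # broadcast the matte over the color channels
    return a * foreground + (1.0 - a) * background

# A fully opaque pixel takes the foreground color, a fully transparent
# pixel takes the background color, and fractional alphas blend linearly.
F = np.full((2, 2, 3), 1.0)   # white foreground
B = np.zeros((2, 2, 3))       # black background
alpha = np.array([[1.0, 0.0],
                  [0.5, 0.25]])
C = composite_over(F, B, alpha)
```

This linear color-blending model is exactly what the dissertation argues is insufficient for reflection, refraction, and shadows, motivating the extended compositing models described above.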