Bidirectional lightcuts

  • Authors: Bruce Walter, Pramook Khungurn, Kavita Bala
  • Affiliation: Cornell University (all authors)
  • Venue: ACM Transactions on Graphics (TOG) - SIGGRAPH 2012 Conference Proceedings
  • Year: 2012

Abstract

Scenes modeling the real world combine a wide variety of phenomena, including glossy materials, detailed heterogeneous anisotropic media, subsurface scattering, and complex illumination. Predictive rendering of such scenes is difficult; unbiased algorithms are typically too slow or too noisy. Virtual point light (VPL) based algorithms produce low-noise results across a wide range of performance/accuracy tradeoffs, from interactive rendering to high-quality offline rendering, but their bias means that locally important illumination features may be missing. We introduce a bidirectional formulation and a set of weighting strategies to significantly reduce the bias in VPL-based rendering algorithms. Our approach, bidirectional lightcuts, maintains the scalability and low-noise global illumination advantages of prior VPL-based work, while significantly extending its generality to support a wider range of important materials and visual cues. We demonstrate scalable, efficient, and low-noise rendering of scenes with highly complex materials, including gloss, BSSRDFs, and anisotropic volumetric models.
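
For context, the bias mentioned in the abstract typically arises because VPL renderers clamp the 1/r^2 geometry term of each virtual light to avoid bright spikes, which systematically discards short-range energy. The following Python sketch is purely illustrative (the function name and the clamp_dist parameter are hypothetical, not from the paper); it shows a simple clamped VPL gather, the kind of estimator whose lost energy bidirectional lightcuts aims to recover with its bidirectional formulation and weighting strategies.

    import math

    def vpl_contribution(surface_pos, surface_normal, vpl_pos, vpl_normal,
                         vpl_flux, brdf_value, clamp_dist=0.1):
        """Illustrative contribution of one virtual point light (VPL) to a
        shading point, using the common geometry-term clamp (hypothetical
        parameter clamp_dist). Visibility is assumed to be 1 for brevity."""
        # Vector from the shading point to the VPL.
        d = [v - s for v, s in zip(vpl_pos, surface_pos)]
        dist2 = sum(c * c for c in d)
        if dist2 == 0.0:
            return 0.0
        dist = math.sqrt(dist2)
        w = [c / dist for c in d]

        # Cosine terms at both ends of the transport segment.
        cos_surf = max(0.0, sum(a * b for a, b in zip(surface_normal, w)))
        cos_vpl = max(0.0, -sum(a * b for a, b in zip(vpl_normal, w)))

        # Geometry term: clamping the squared distance bounds the 1/r^2
        # singularity but loses nearby illumination, which is the bias.
        geometry = cos_surf * cos_vpl / max(dist2, clamp_dist * clamp_dist)

        return vpl_flux * brdf_value * geometry

    # Example: a VPL only 0.05 units away; the clamp caps the geometry
    # term at 1/clamp_dist^2, so part of the close-range energy is lost.
    radiance = vpl_contribution(
        surface_pos=(0.0, 0.0, 0.0), surface_normal=(0.0, 0.0, 1.0),
        vpl_pos=(0.0, 0.0, 0.05), vpl_normal=(0.0, 0.0, -1.0),
        vpl_flux=1.0, brdf_value=1.0 / math.pi)
    print(radiance)

In this toy setup the unclamped geometry term would be 400, but the clamp reduces it to 100; that missing short-range energy is exactly the kind of locally important illumination feature (e.g., gloss, subsurface scattering) the abstract says plain VPL methods can miss.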