Capturing and rendering with incident light fields

  • Authors:
  • J. Unger, A. Wenger, T. Hawkins, A. Gardner, P. Debevec

  • Affiliations:
  • J. Unger: Linköping University, Norrköping Visualization and Interaction Studio, Sweden; A. Wenger, T. Hawkins, A. Gardner, P. Debevec: University of Southern California, Institute for Creative Technologies, United States

  • Venue:
  • EGRW '03: Proceedings of the 14th Eurographics Workshop on Rendering
  • Year:
  • 2003

Abstract

This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first, we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses across a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination.
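To illustrate the light field step described in the abstract, the sketch below shows one plausible way to query a planar incident light field when shading a point in the volume above the capture plane: the light ray through the shading point is extended back to the plane, and the four nearest captured omnidirectional HDR images are blended bilinearly before looking up the radiance in the incoming direction. This is a minimal illustration under assumed conventions (a regular capture grid in the z = 0 plane, latitude/longitude environment maps); the class `IncidentLightField`, its `lookup` method, and the `radiance_maps` layout are hypothetical names, not the paper's actual data structures or shader code.

```python
import numpy as np

class IncidentLightField:
    """Hypothetical planar incident light field: a regular grid of
    omnidirectional HDR radiance maps captured in the z = 0 plane."""

    def __init__(self, radiance_maps, grid_origin, grid_spacing):
        # radiance_maps: array of shape (ny, nx, H, W, 3); one HDR
        # latitude/longitude environment map per capture position.
        self.maps = radiance_maps
        self.origin = np.asarray(grid_origin, dtype=float)   # (x0, y0)
        self.spacing = float(grid_spacing)

    def lookup(self, point, direction):
        """Estimate the radiance arriving at `point` from `direction`,
        where `direction` points from the shading point toward the light."""
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        if abs(d[2]) < 1e-8:
            return np.zeros(3)            # ray parallel to the capture plane
        # Extend the light ray through `point` to the plane z = 0.
        t = -point[2] / d[2]
        hit = np.asarray(point, dtype=float) + t * d

        # Continuous grid coordinates of the intersection point.
        gx = (hit[0] - self.origin[0]) / self.spacing
        gy = (hit[1] - self.origin[1]) / self.spacing
        ny, nx = self.maps.shape[:2]
        x0 = int(np.clip(np.floor(gx), 0, nx - 2))
        y0 = int(np.clip(np.floor(gy), 0, ny - 2))
        fx = float(np.clip(gx - x0, 0.0, 1.0))
        fy = float(np.clip(gy - y0, 0.0, 1.0))

        # Map the incoming direction to a (row, col) in a lat/long map.
        H, W = self.maps.shape[2:4]
        theta = np.arccos(np.clip(d[2], -1.0, 1.0))      # polar angle
        phi = np.arctan2(d[1], d[0]) % (2 * np.pi)       # azimuth
        row = min(int(theta / np.pi * H), H - 1)
        col = min(int(phi / (2 * np.pi) * W), W - 1)

        # Bilinear blend of the four surrounding captured environment maps.
        c00 = self.maps[y0,     x0,     row, col]
        c10 = self.maps[y0,     x0 + 1, row, col]
        c01 = self.maps[y0 + 1, x0,     row, col]
        c11 = self.maps[y0 + 1, x0 + 1, row, col]
        return ((1 - fx) * (1 - fy) * c00 + fx * (1 - fy) * c10 +
                (1 - fx) * fy * c01 + fx * fy * c11)
```

A renderer's surface shader could call such a lookup once per light sample, accumulating the returned radiance weighted by the BRDF and cosine term; the bilinear blend across capture positions is what lets spotlights, shadow edges, and dappled lighting vary smoothly over the synthetic object's surface.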