Adaptive Spatial Sample Caching

  • Authors:
  • Andreas Dietrich; Philipp Slusallek

  • Affiliations:
  • Computer Graphics Group, Saarland University, Germany, e-mail: dietrich@cs.uni-sb.de; Computer Graphics Group, Saarland University, Germany, e-mail: slusallek@cs.uni-sb.de

  • Venue:
  • RT '07 Proceedings of the 2007 IEEE Symposium on Interactive Ray Tracing
  • Year:
  • 2007

Abstract

Despite tremendous progress in the last few years, realistically rendering 3D scenes with advanced shading, and particularly with global illumination, is still extremely difficult to perform at interactive rates. Precomputation techniques can reduce the cost during rendering; however, they typically require long preprocessing times and large amounts of storage. We propose a novel world-space sample caching approach for walkthroughs of static scenes that does not require precomputation and relies instead on aggressive caching. At run time, pixel-sized patches projected adaptively onto the tangent space of visible triangles are used to store the results of shading computations. Patches are organized in a cache that requires only a small, fixed memory footprint. In subsequent frames these patches can be retrieved, thus exploiting frame-to-frame coherence. In contrast to previous caching methods, the presented technique is extremely easy to implement and requires only a few dozen lines of code. This caching mechanism can reduce the number of secondary rays in subsequent frames by more than an order of magnitude for moderate camera movements.
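
To illustrate the general idea, the following is a minimal sketch of such a world-space sample cache, not the paper's actual implementation. It assumes shading results are keyed by a triangle ID plus quantized tangent-space patch coordinates, and uses a direct-mapped, fixed-size table so the memory footprint stays constant; all names (SpatialSampleCache, makeKey, etc.) are hypothetical.

```cpp
#include <cstdint>
#include <vector>

struct RGB { float r = 0.f, g = 0.f, b = 0.f; };

struct CacheEntry {
    uint64_t key = UINT64_MAX;  // packed (triangle, patchU, patchV); UINT64_MAX marks an empty slot
    RGB      value;             // cached shading result
};

class SpatialSampleCache {
public:
    explicit SpatialSampleCache(size_t numEntries = 1 << 20)
        : entries_(numEntries) {}  // fixed number of entries -> fixed memory footprint

    // Pack the triangle ID and the quantized tangent-space patch coordinates into one key.
    static uint64_t makeKey(uint32_t triangleId, uint16_t patchU, uint16_t patchV) {
        return (uint64_t(triangleId) << 32) | (uint64_t(patchU) << 16) | patchV;
    }

    // Return true and fill 'out' if a shading result for this patch is cached.
    bool lookup(uint64_t key, RGB& out) const {
        const CacheEntry& e = entries_[hash(key) % entries_.size()];
        if (e.key != key) return false;
        out = e.value;
        return true;
    }

    // Store a freshly shaded result; direct-mapped, so collisions simply overwrite.
    void insert(uint64_t key, const RGB& value) {
        CacheEntry& e = entries_[hash(key) % entries_.size()];
        e.key = key;
        e.value = value;
    }

private:
    static uint64_t hash(uint64_t k) {  // simple integer mixing function
        k ^= k >> 33; k *= 0xff51afd7ed558ccdULL; k ^= k >> 33;
        return k;
    }
    std::vector<CacheEntry> entries_;
};

// Hypothetical use during shading of a primary-ray hit point:
//   uint64_t key = SpatialSampleCache::makeKey(hit.triangleId, quantU, quantV);
//   RGB color;
//   if (!cache.lookup(key, color)) {     // cache miss: trace secondary rays
//       color = shadeWithSecondaryRays(hit);
//       cache.insert(key, color);        // reuse in later frames
//   }
```

In this sketch the direct-mapped replacement policy is what keeps the footprint fixed: hits in later frames skip secondary-ray shading entirely, which is the source of the frame-to-frame savings the abstract describes.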