A lazy object-space shading architecture with decoupled sampling

  • Authors: Christopher A. Burns, Kayvon Fatahalian, William R. Mark
  • Affiliations: Intel Labs; Stanford University; Intel Labs

  • Venue: Proceedings of the Conference on High Performance Graphics
  • Year: 2010


Abstract

We modify the Reyes object-space shading approach to address two inefficiencies that result from performing shading calculations at micropolygon grid vertices prior to rasterization. First, our system samples shading of surface sub-patches uniformly in the object's parametric domain, but the locations of shading samples need not correspond to the locations of mesh vertices. We thus retain the benefits of object-space shading, which efficiently supports motion and defocus blur, without requiring micropolygons to achieve a shading rate of one sample per pixel. Second, our system resolves surface visibility prior to shading, then lazily shades 2x2 blocks of shading samples that are known to contribute to the resulting fragments. We find that, in comparison to a Reyes micropolygon rendering pipeline, decoupling the geometric sampling rate from the shading rate permits the use of meshes containing an order of magnitude fewer vertices with minimal loss of image quality in our test scenes. Shading on demand after rasterization reduces shader invocations by more than a factor of two compared to pre-visibility object-space shading.
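The core idea of shading lazily in the parametric domain can be sketched in a few lines. The following is an illustrative toy, not the paper's implementation: the shader, the 8x8 per-sub-patch shading rate, and the function names (`shade`, `shade_block`, `lookup_shaded`) are all hypothetical. It shows how visible samples map to 2x2 blocks of (u, v) shading samples, each block being shaded at most once via memoization, so samples that never contribute to a fragment are never shaded.

```python
# Toy sketch of lazy object-space shading with decoupled sampling.
# Assumed setup: each surface sub-patch carries a uniform grid of
# shading samples in its (u, v) parametric domain, independent of
# the mesh vertex positions.
from functools import lru_cache

SHADING_RATE = 8  # hypothetical: 8x8 shading samples per sub-patch


def shade(patch_id, u, v):
    """Placeholder surface shader evaluated at a parametric location."""
    return (u * v, u + v, abs(u - v))  # dummy RGB


@lru_cache(maxsize=None)
def shade_block(patch_id, bu, bv):
    """Shade one 2x2 block of shading samples, exactly once (memoized)."""
    step = 1.0 / SHADING_RATE
    return {
        (bu + du, bv + dv): shade(patch_id, (bu + du) * step, (bv + dv) * step)
        for du in (0, 1)
        for dv in (0, 1)
    }


def lookup_shaded(patch_id, su, sv):
    """Map a visible shading sample to its enclosing 2x2 block.

    The block is shaded on demand the first time any of its four
    samples is requested; later requests hit the cache.
    """
    block = shade_block(patch_id, su - su % 2, sv - sv % 2)
    return block[(su, sv)]


# A visibility pass (assumed to run before shading, as in the paper's
# pipeline) yields the indices of shading samples that contribute to
# fragments; only their enclosing 2x2 blocks are ever shaded.
visible = [(0, 0), (0, 1), (5, 5)]
colors = [lookup_shaded(42, su, sv) for su, sv in visible]
```

Here samples (0, 0) and (0, 1) share one block, so only two blocks are shaded for three visible samples; occluded regions of the patch incur no shading work at all, which is the source of the reduction in shader invocations described above.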