Image-based bidirectional scene reprojection

  • Authors and affiliations:
  • Lei Yang (Hong Kong UST); Yu-Chiu Tse (Hong Kong UST); Pedro V. Sander (Hong Kong UST); Jason Lawrence (University of Virginia); Diego Nehab (Microsoft Research and IMPA); Hugues Hoppe (Microsoft Research); Clara L. Wilkins (Wesleyan University)

  • Venue:
  • Proceedings of the 2011 SIGGRAPH Asia Conference
  • Year:
  • 2011

Abstract

We introduce a method for increasing the frame rate of real-time rendering applications. Whereas many existing temporal upsampling strategies only reuse information from previous frames, our bidirectional technique reconstructs intermediate frames from a pair of consecutive rendered frames. This significantly improves the accuracy and efficiency of data reuse, since very few pixels are simultaneously occluded in both frames. We present two versions of this basic algorithm. The first is appropriate for fill-bound scenes, as it limits the number of expensive shading calculations, but it involves rasterizing the scene geometry at each intermediate frame. The second version, our more significant contribution, reduces both shading and geometry computations by performing reprojection using only image-based buffers. It warps and combines the adjacent rendered frames using an efficient iterative search on their stored scene depth and flow. Bidirectional reprojection introduces a small amount of lag. We perform a user study to investigate this lag, and find that its effect is minor. We demonstrate substantial performance improvements (3--4x) for a variety of applications, including vertex-bound and fill-bound scenes, multi-pass effects, and motion blur.
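The core idea of the abstract can be sketched in a few lines: warp each of the two surrounding rendered frames toward the intermediate time t and blend them, weighting the temporally nearer frame more heavily. The sketch below is a deliberate simplification, not the paper's method: it uses a single nearest-neighbor backward gather with the flow sampled at the target pixel, whereas the paper performs an iterative search over the stored depth and flow buffers to locate the true source pixel and handles occlusions. The function name and array layout are illustrative assumptions.

```python
import numpy as np

def reproject_bidirectional(frame0, frame1, flow, t):
    """Approximate an intermediate frame at time t in [0, 1].

    frame0, frame1 : (H, W) arrays of rendered shading results
    flow           : (H, W, 2) per-pixel motion (dy, dx) from frame0 to frame1
    t              : 0 -> frame0, 1 -> frame1

    Simplified stand-in for the paper's image-based reprojection: it
    gathers with the flow evaluated at the destination pixel instead of
    running the iterative source-pixel search, and ignores occlusions.
    """
    h, w = frame0.shape[:2]
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")

    # Backward-gather from frame0 (move against the flow by t) and from
    # frame1 (move along the flow by 1 - t); clamp to the image bounds.
    y0 = np.clip(np.round(ys - t * flow[..., 0]).astype(int), 0, h - 1)
    x0 = np.clip(np.round(xs - t * flow[..., 1]).astype(int), 0, w - 1)
    y1 = np.clip(np.round(ys + (1 - t) * flow[..., 0]).astype(int), 0, h - 1)
    x1 = np.clip(np.round(xs + (1 - t) * flow[..., 1]).astype(int), 0, w - 1)

    warped0 = frame0[y0, x0]
    warped1 = frame1[y1, x1]

    # Blend, weighting the temporally nearer frame more heavily.
    return (1 - t) * warped0 + t * warped1
```

Because both source frames contribute, a pixel is lost only if it is occluded in both, which is the accuracy advantage the abstract describes over one-sided (forward-only) reuse; the blend weights also explain the lag, since displaying time t requires frame1 to have been rendered already.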