Variance Invariant Adaptive Temporal Supersampling for Motion Blurring

  • Authors:
  • Daniel Neilson; Yee-Hong Yang

  • Venue:
  • PG '03 Proceedings of the 11th Pacific Conference on Computer Graphics and Applications
  • Year:
  • 2003

Abstract

Adaptive temporal sampling, used to create motion blur in distributed ray tracing, generates more sample points in regions with motion blur than in regions without it. When the number of sample points used on stationary objects in motion-blurred regions exceeds the number used in other regions of the image, the variance in the colour of the object can differ between the two regions. This paper identifies the cause of this variance discrepancy and proposes a modification to existing adaptive temporal sampling algorithms that eliminates it. Our results demonstrate that the variance of stationary objects remains approximately the same throughout the entire image, and that the proposed modification is capable of improving the running time of existing adaptive temporal sampling algorithms.
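
The discrepancy described above follows from basic Monte Carlo behaviour: the variance of an N-sample average of a shading estimate scales roughly as 1/N, so a stationary surface that happens to lie inside an oversampled, motion-blurred region appears less noisy than the same surface elsewhere. The Python sketch below is not the authors' algorithm; it is a minimal illustration under assumed sample counts (4 vs. 16 rays per pixel) and an assumed Gaussian shading-noise model.

  # Minimal sketch (assumptions: Gaussian shading noise, 4 vs. 16 samples/pixel).
  # It shows why a stationary object's colour variance differs between regions
  # that receive different numbers of temporal samples.
  import random

  def estimate_pixel(true_colour, shading_noise, n_samples):
      # Average n_samples noisy shading evaluations, as a distributed
      # ray tracer would when supersampling a pixel over the shutter time.
      samples = [true_colour + random.gauss(0.0, shading_noise)
                 for _ in range(n_samples)]
      return sum(samples) / n_samples

  def variance(values):
      mean = sum(values) / len(values)
      return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

  random.seed(0)
  trials = 2000
  # Hypothetical counts: 4 rays/pixel in still regions, 16 rays/pixel where
  # adaptive temporal sampling detects motion blur.
  few  = [estimate_pixel(0.5, 0.1, 4)  for _ in range(trials)]
  many = [estimate_pixel(0.5, 0.1, 16) for _ in range(trials)]

  # The same stationary object shows roughly 4x lower variance where it was
  # oversampled; this is the discrepancy the paper's modification removes.
  print("variance with  4 samples per pixel:", variance(few))
  print("variance with 16 samples per pixel:", variance(many))

Running the sketch prints a variance about four times smaller for the 16-sample case, which is the visible mismatch in noise level that the proposed variance-invariant modification is designed to avoid.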