Perceived aliasing thresholds in high-fidelity rendering

  • Authors: Veronica Sundstedt, Kurt Debattista, Alan Chalmers
  • Affiliations: University of Bristol (all authors)
  • Venue: APGV '05: Proceedings of the 2nd Symposium on Applied Perception in Graphics and Visualization
  • Year: 2005

Abstract

High-fidelity rendering is very computationally expensive, making it difficult to achieve interactive rates except for simple scenes. Recent selective rendering techniques [Yee et al. 2001; Cater et al. 2003], which alter the number of rays cast per pixel, have been explored to achieve this goal. These approaches have shown that rendering times can be significantly reduced without perceptual degradation. In traditional ray-traced images, aliasing is removed by supersampling the image plane. In this sketch we identify the threshold at which decreasing the number of rays shot per pixel results in no perceptual degradation. We conduct psychophysical experiments using four realistic environments and one test environment as a comparison. The test scene was designed to exhibit high spatial frequencies and thus represents a worst case for aliasing. We determine the computational bound by varying the number of rays shot per pixel in both still images and animations. The lighting simulation system Radiance [Ward 1994] is adapted for use in these experiments. The results can be used in the design of more effective perceptual selective rendering algorithms, using the computational bound as an indication of where to threshold the rendering process [Sundstedt et al. 2005]. Selective rendering alters this threshold based on the perceptual importance of pixels within the image, reducing computation time while maintaining a perceptually high-quality result for realistic scenes.
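
As a rough illustration of the selective supersampling strategy the abstract describes, the following Python sketch varies the number of jittered rays shot per pixel according to a per-pixel importance map, falling back to a minimum ray count below the threshold. The names trace_ray, render_selective, and the importance map are illustrative assumptions for this sketch, not the authors' implementation or Radiance's API.

```python
import random

def trace_ray(x, y):
    """Hypothetical stand-in for a renderer's ray evaluation.

    A real implementation (e.g. Radiance) would intersect scene geometry
    and evaluate the lighting simulation for a ray through image-plane
    point (x, y); here it just returns a constant radiance.
    """
    return 0.0

def render_selective(width, height, importance, max_rays=16, min_rays=1):
    """Shoot a variable number of jittered rays per pixel.

    `importance` maps each pixel to a value in [0, 1]; perceptually
    salient pixels receive up to `max_rays` samples, while pixels below
    the threshold fall back to `min_rays`, trading antialiasing effort
    for rendering time.
    """
    image = [[0.0] * width for _ in range(height)]
    for py in range(height):
        for px in range(width):
            # Scale rays per pixel by perceptual importance.
            n = max(min_rays, round(importance[py][px] * max_rays))
            total = 0.0
            for _ in range(n):
                # Jitter sample positions within the pixel so residual
                # aliasing becomes less objectionable noise.
                total += trace_ray(px + random.random(), py + random.random())
            image[py][px] = total / n
    return image
```

In this sketch the importance map plays the role of the perceptual threshold discussed in the abstract: psychophysically derived bounds on rays per pixel would determine how low `min_rays` can be set without visible degradation.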