Perception-based global illumination, rendering, and animation techniques

  • Authors: Karol Myszkowski
  • Affiliation: Max-Planck-Institut für Informatik, Germany
  • Venue: SCCG '02: Proceedings of the 18th Spring Conference on Computer Graphics
  • Year: 2002

Abstract

In this paper, we consider applications of perception-based video quality metrics to improve the performance of global lighting computations and the rendering of animation sequences. To control the computation of animation frames, we use the Animation Quality Metric (AQM), an extended version of the Visible Difference Predictor (VDP) developed by Daly. We show two applications of the AQM: (1) the rendering of high-quality walk-throughs of static environments and (2) the computation of global illumination for dynamic environments.

To improve the rendering performance of our walk-through solution, we combine ray tracing with Image-Based Rendering (IBR). In this hybrid, we derive as many pixels as possible with inexpensive IBR techniques without affecting the animation quality, and the AQM automatically guides the choice between the two.

We also present a method for efficient global illumination computation in dynamic environments that takes advantage of the temporal coherence of the lighting distribution. The method is embedded in a framework of stochastic photon tracing and density estimation techniques. The AQM is used to keep the noise inherent in stochastic methods below the sensitivity level of the human observer. As a result, perceptually consistent quality is obtained across all animation frames, and the computation cost is reduced compared to traditional approaches that operate solely in the spatial domain.
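To give a rough feel for the first application, the sketch below composes an in-between frame from two IBR-warped estimates (derived from the surrounding keyframes) and ray traces only those pixels where the two estimates disagree visibly. This is a minimal sketch, not the paper's pipeline: a simple Weber-law contrast threshold stands in for the full spatiotemporal AQM/VDP machinery, and the function names and the `ray_trace_pixel` callback are hypothetical.

```python
import numpy as np

def visible_difference_mask(warp_a, warp_b, weber_fraction=0.02, floor=1e-3):
    """Crude stand-in for a perceptual metric: flag pixels whose two
    IBR-derived luminance estimates disagree by more than a Weber-law
    contrast threshold, i.e. where the warped pixels cannot be trusted."""
    adaptation = 0.5 * (warp_a + warp_b)
    threshold = np.maximum(weber_fraction * adaptation, floor)
    return np.abs(warp_a - warp_b) > threshold

def hybrid_inbetween_frame(warp_a, warp_b, ray_trace_pixel):
    """Compose an in-between frame: keep cheap IBR pixels where the
    predicted difference is below visibility, ray trace the rest.
    `ray_trace_pixel(x, y)` is a hypothetical callback into the renderer."""
    mask = visible_difference_mask(warp_a, warp_b)
    frame = 0.5 * (warp_a + warp_b)          # blend the two warped estimates
    ys, xs = np.nonzero(mask)
    for y, x in zip(ys, xs):
        frame[y, x] = ray_trace_pixel(x, y)  # expensive path only where needed
    return frame
```

The second application can be illustrated in a similarly simplified way: photon statistics gathered for neighbouring frames are averaged during density estimation, so the per-frame photon budget can be reduced while stochastic noise drops roughly with the square root of the window length. Everything below is an assumption for illustration; a synthetic one-dimensional "scene" stands in for the real photon tracer, and the averaging window is fixed, whereas the paper chooses the temporal extent adaptively under AQM control.

```python
import numpy as np

rng = np.random.default_rng(7)

def trace_photons(frame, n_photons, n_patches=64):
    """Hypothetical stochastic photon pass: returns normalized per-patch hit
    counts for one animation frame, faked here with a slowly drifting light
    source so that temporal coherence actually exists."""
    centre = 0.5 + 0.3 * np.sin(0.05 * frame)        # light drifts slowly
    hits = rng.normal(centre, 0.15, size=n_photons)   # photon landing points
    counts, _ = np.histogram(np.clip(hits, 0.0, 1.0),
                             bins=n_patches, range=(0.0, 1.0))
    return counts / n_photons

def temporally_filtered_irradiance(frame, window=5, photons_per_frame=2_000):
    """Density estimation that reuses photon statistics from neighbouring
    frames: averaging over `window` frames cuts noise for the same per-frame
    photon budget, at the risk of temporal lag if the window is too long."""
    frames = range(frame - window // 2, frame + window // 2 + 1)
    return np.mean([trace_photons(f, photons_per_frame) for f in frames], axis=0)
```

In the paper the trade-off implicit in the window length (residual noise versus lag behind moving lighting) is what the AQM is meant to arbitrate, keeping both artifacts below the visibility threshold of a human observer.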