Rendering in shift-invariant spaces

  • Authors: Usman R. Alim
  • Affiliation: University of Calgary
  • Venue: Proceedings of Graphics Interface 2013
  • Year: 2013


Abstract

We present a novel image representation method based on shift-invariant spaces. Unlike existing rendering methods, our approach consists of two steps: an analog acquisition step that traces rays through the scene, and a subsequent digital processing step that filters the intermediate digital image to obtain the coefficients of a minimum-error continuous image approximation. The approach can be incorporated into existing renderers with minimal changes and little to no computational overhead. Additionally, we introduce the tools needed to analyze the smoothing and post-aliasing properties of minimum-error approximations. We provide examples of spaces, generated by uniform B-splines, that can readily be used with the two-dimensional Cartesian grid. Our experimental results demonstrate that minimum-error approximations significantly improve image quality by preserving high-frequency details that are usually smoothed out by existing anti-aliasing approaches.
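
To make the two-step pipeline concrete, the following Python sketch illustrates the general idea under stated assumptions: it takes an intermediate digital image (standing in for the ray-traced acquisition step), applies a separable FFT-domain digital filter to obtain cubic B-spline coefficients, and then evaluates the resulting continuous image. The filter used here is the classic interpolating cubic B-spline prefilter with periodic boundaries, not the paper's minimum-error filter, and the function names (bspline3_prefilter, evaluate) are hypothetical; this is only an illustrative sketch of filtering samples into shift-invariant-space coefficients.

```python
import numpy as np

def bspline3_prefilter(samples):
    """Turn grid samples into cubic B-spline coefficients with a separable
    FFT-domain filter (periodic boundaries).  The filter divides by the
    DTFT of the integer-sampled cubic B-spline, (2 + cos w) / 3; this is
    the standard interpolating prefilter, used here only as a stand-in
    for the paper's minimum-error filtering step."""
    ny, nx = samples.shape
    wy = 2.0 * np.pi * np.fft.fftfreq(ny)
    wx = 2.0 * np.pi * np.fft.fftfreq(nx)
    response = np.outer((2.0 + np.cos(wy)) / 3.0, (2.0 + np.cos(wx)) / 3.0)
    return np.real(np.fft.ifft2(np.fft.fft2(samples) / response))

def bspline3(t):
    """Centered cubic B-spline kernel, support (-2, 2)."""
    t = abs(t)
    if t < 1.0:
        return 2.0 / 3.0 - t * t + 0.5 * t ** 3
    if t < 2.0:
        return (2.0 - t) ** 3 / 6.0
    return 0.0

def evaluate(coeffs, y, x):
    """Evaluate the continuous image sum_k c[k] * beta3(y - ky) * beta3(x - kx)
    at a real-valued location, with periodic boundary handling."""
    ny, nx = coeffs.shape
    iy, ix = int(np.floor(y)), int(np.floor(x))
    value = 0.0
    for ky in range(iy - 1, iy + 3):        # the 4x4 block of coefficients whose
        for kx in range(ix - 1, ix + 3):    # basis functions overlap (y, x)
            value += (coeffs[ky % ny, kx % nx]
                      * bspline3(y - ky) * bspline3(x - kx))
    return value

# Hypothetical usage: 'samples' stands in for the intermediate digital image
# produced by the ray-tracing acquisition step.
samples = np.random.rand(64, 64)
coeffs = bspline3_prefilter(samples)
print(evaluate(coeffs, 10.25, 31.7))
```

Because the filtering step acts only on the intermediate digital image, it can be appended to an existing renderer's output stage, which is consistent with the abstract's claim of little-to-no computational overhead.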