Hybrid forward resampling and volume rendering

  • Authors:
  • Xiaoru Yuan; Minh X. Nguyen; Hui Xu; Baoquan Chen

  • Affiliations:
  • University of Minnesota at Twin Cities; University of Minnesota at Twin Cities; University of Minnesota at Twin Cities; University of Minnesota at Twin Cities

  • Venue:
  • VG '03 Proceedings of the 2003 Eurographics/IEEE TVCG Workshop on Volume graphics
  • Year:
  • 2003

Abstract

The transformation and rendering of discrete objects, such as traditional images (with or without depth) and volumes, can be considered a resampling problem: objects are reconstructed, transformed, filtered, and finally sampled onto the screen grid. In resampling practice, discrete samples (pixels, voxels) can be treated either as infinitesimal sample points (simply called points) or as samples of finite size (splats). Resampling can also be performed either forward or backward, in either the source domain or the target domain. In this paper, we present a framework featuring hybrid forward resampling for discrete rendering and apply it to enhance volumetric splatting. In this approach, minified voxels are treated simply as points filtered in screen space, while magnified voxels are treated as spherical splats. In addition, we develop two techniques for accurate and efficient perspective splatting: the first efficiently computes the 2D elliptical geometry of perspectively projected splats; the second achieves an accurate perspective reconstruction filter. Our experimental results demonstrate both the antialiasing effectiveness and the rendering efficiency of this approach.
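
The hybrid rule described in the abstract (minified voxels as screen-space points, magnified voxels as elliptical splats) can be illustrated with a short sketch. This is a minimal approximation under a pinhole-camera assumption and is not the paper's implementation; the function names (projected_footprint, classify_voxel) and the small-angle tangent-cone formula for the projected ellipse are ours for illustration.

```python
import numpy as np

def projected_footprint(center_cam, radius, focal_px):
    """Approximate the screen-space ellipse of a sphere of `radius`
    centered at `center_cam` (camera space, z > 0), for a pinhole
    camera with focal length `focal_px` in pixels.

    Returns (center_xy, major_semi_axis_px, minor_semi_axis_px, angle_rad).
    """
    x, y, z = center_cam
    d2 = x * x + y * y + z * z            # squared distance to sphere center
    r2 = radius * radius
    if d2 <= r2:
        raise ValueError("camera is inside the splat")

    # Approximate screen-space center of the projected sphere.
    cx, cy = focal_px * x / z, focal_px * y / z

    # Under perspective projection a sphere maps to an ellipse elongated
    # radially (away from the image center).  Small-angle approximation
    # of the tangent-cone silhouette:
    base = focal_px * radius / np.sqrt(d2 - r2)   # semi-axis for an on-axis sphere
    stretch = np.sqrt(d2) / z                     # 1 / cos(view angle)
    minor = base * stretch                        # tangential semi-axis
    major = base * stretch * stretch              # radial semi-axis
    angle = np.arctan2(cy, cx)                    # major axis points radially
    return (cx, cy), major, minor, angle

def classify_voxel(center_cam, radius, focal_px, pixel_size=1.0):
    """Hybrid rule: a voxel whose projected footprint is smaller than a pixel
    is treated as a point filtered in screen space; otherwise it is splatted."""
    _, major, _, _ = projected_footprint(center_cam, radius, focal_px)
    return "point" if 2.0 * major < pixel_size else "splat"

if __name__ == "__main__":
    # A distant voxel shrinks below one pixel -> point;
    # a nearby voxel covers several pixels -> elliptical splat.
    print(classify_voxel(np.array([0.1, 0.0, 500.0]), radius=0.5, focal_px=800.0))
    print(classify_voxel(np.array([0.1, 0.0, 5.0]), radius=0.5, focal_px=800.0))
```

In a renderer following this scheme, the per-voxel classification would be made once per frame after the view transform, so the cost of the ellipse computation is only paid for the magnified voxels that are actually rasterized as splats.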