A perceptual approach to trimming and tuning unstructured lumigraphs

  • Authors:
  • Yann Morvan; Carol O'Sullivan

  • Affiliations:
  • Graphics, Vision and Visualisation Group, Trinity College Dublin; Graphics, Vision and Visualisation Group, Trinity College Dublin

  • Venue:
  • ACM Transactions on Applied Perception (TAP)
  • Year:
  • 2009

Abstract

We present a novel perceptual method to reduce the visual redundancy of unstructured lumigraphs, an image-based representation designed for interactive rendering. We combine features of the unstructured lumigraph algorithm and image fidelity metrics to efficiently rank the perceptual impact of removing subregions of input views (subviews). We use a greedy approach to estimate the order in which subviews should be pruned so as to minimize perceptual degradation at each step. Renderings using varying numbers of subviews can then be easily visualized with confidence that the retained subviews are well chosen, which facilitates the choice of how many to retain. The remaining regions of the input views are repacked into a texture atlas. Our method takes advantage of any available scene geometry information, but requires only a very coarse approximation. We perform a user study to validate the method's behaviour and to investigate the impact of the choice of image fidelity metric and of the user parameters. The three metrics considered fall into the physical, statistical, and perceptual categories. The overall benefit of our method is the semiautomation of the view selection process, resulting in unstructured lumigraphs that are thriftier in texture memory use and faster to render. Using the same framework, we also adjust the parameters of the unstructured lumigraph algorithm to optimise it on a scene-by-scene basis.
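The greedy pruning strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: render_without() and fidelity_loss() are hypothetical placeholders standing in for the unstructured lumigraph renderer and the chosen image fidelity metric (physical, statistical, or perceptual).

```python
def greedy_prune(subviews, reference_render, render_without, fidelity_loss):
    """Order subviews from least to most perceptually important.

    Hypothetical sketch: at each step, remove the subview whose absence
    causes the smallest fidelity loss relative to the reference rendering.
    """
    remaining = list(subviews)
    pruning_order = []
    while remaining:
        # Rank the perceptual impact of removing each candidate subview.
        best = min(
            remaining,
            key=lambda sv: fidelity_loss(
                reference_render,
                render_without(remaining, excluded=sv),
            ),
        )
        # Greedily prune the least perceptually important subview.
        remaining.remove(best)
        pruning_order.append(best)
    return pruning_order
```

Under this reading, a rendering that uses only N subviews corresponds to discarding the first len(subviews) - N entries of pruning_order, which is what lets the user visualize candidate reductions and choose how many subviews to retain.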