On uniform resampling and gaze analysis of bidirectional texture functions

  • Authors:
  • Jiří Filip; Michael J. Chantler; Michal Haindl

  • Affiliations:
  • Heriot-Watt University and Institute of Information Theory and Automation of the AS CR; Heriot-Watt University; Institute of Information Theory and Automation of the AS CR

  • Venue:
  • ACM Transactions on Applied Perception (TAP)
  • Year:
  • 2009

Abstract

The use of illumination- and view-dependent texture information is currently the most accurate way to capture the appearance of real-world materials. One example is the Bidirectional Texture Function (BTF). The main disadvantage of these data is their massive size. In this article, we employ perceptually based methods to allow more efficient handling of such data. In the first step, we analyze different uniform resamplings by means of a psychophysical study with 11 subjects, comparing renderings of the original data with renderings of versions uniformly resampled over the hemispheres of illumination and view directions of the textural measurements. We found that down-sampling in the view and illumination azimuthal angles is less apparent than in the elevation angles, and that illumination directions can be down-sampled more than view directions without loss of visual accuracy. In the second step, we analyzed the subjects' gaze fixations during the experiment. The gaze analysis confirmed the results of the psychophysical study and revealed that subjects fixated on locations aligned with the direction of the main gradient in the rendered stimuli. As this gradient was mostly aligned with the illumination gradient, we conclude that subjects observed the materials mainly along the direction of the illumination gradient. Our results provide interesting insights into human perception of real materials and show promising consequences for the development of more efficient compression and rendering algorithms for this kind of massive data.
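
The sketch below is not from the paper; it only illustrates the kind of uniform angular down-sampling the abstract describes, assuming (hypothetically) that the BTF is stored as a dense array indexed by illumination and view elevation/azimuth on a regular grid. The function name, array layout, and step parameters are illustrative assumptions.

```python
import numpy as np

def downsample_btf(btf, illum_azim_step=2, view_azim_step=1,
                   illum_elev_step=1, view_elev_step=1):
    """Keep every k-th angular sample of a BTF on a regular grid.

    Assumed shape:
    (n_illum_elev, n_illum_azim, n_view_elev, n_view_azim, H, W, C).
    Per the study's findings, azimuthal angles (and illumination
    directions in particular) tolerate coarser sampling than elevations.
    """
    return btf[::illum_elev_step, ::illum_azim_step,
               ::view_elev_step, ::view_azim_step]

# Toy BTF: 5 elevations x 12 azimuths for both illumination and view,
# with a 64x64 RGB texture per direction pair.
toy_btf = np.zeros((5, 12, 5, 12, 64, 64, 3), dtype=np.float32)

# Halve only the illumination azimuth sampling, the reduction the study
# found to be least perceptually apparent.
reduced = downsample_btf(toy_btf, illum_azim_step=2)
print(toy_btf.size // reduced.size)  # -> 2 (data size halved)
```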