Bimodal perception of audio-visual material properties for virtual environments

  • Authors:
  • Nicolas Bonneel; Clara Suied; Isabelle Viaud-Delmon; George Drettakis

  • Affiliations:
  • REVES/INRIA Sophia-Antipolis; CNRS-UPMC UMR 7593; CNRS-UPMC UMR 7593; REVES/INRIA Sophia-Antipolis

  • Venue:
  • ACM Transactions on Applied Perception (TAP)
  • Year:
  • 2010


Abstract

High-quality rendering of both audio and visual material properties is very important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the level of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on perception of material discrimination, when varying the levels of detail of modal synthesis for sound, and bidirectional reflectance distribution functions for graphics. We performed an experiment for two different models (a Dragon and a Bunny model) and two material types (plastic and gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity, when comparing approximate levels of detail to a high-quality audio-visual reference rendering. We show how this result can contribute to significant savings in computation time in an interactive audio-visual rendering system. To our knowledge, this is the first study that shows interaction of audio and graphics representation in a material perception task.