GPU-based acoustical occlusion modeling with acoustical texture maps

  • Authors:
  • Brent Cowan; Bill Kapralos

  • Affiliations:
  • University of Ontario Institute of Technology, Oshawa, Ontario, Canada (both authors)

  • Venue:
  • Proceedings of the 6th Audio Mostly Conference: A Conference on Interaction with Sound
  • Year:
  • 2011

Abstract

Although the direct path between a sound source and a receiver is often occluded, sound may still reach the receiver as it diffracts ("bends") around the occluding obstacle/object. Diffraction is an elementary means of sound propagation, yet despite its importance it is largely ignored in virtual reality and gaming applications, except perhaps in trivial environments. Given the widespread availability of computer graphics hardware, and the graphics processing unit (GPU) in particular, GPUs have been successfully applied to non-graphics tasks, including audio processing and acoustical diffraction modeling. Here we build upon our previous work that approximates acoustical occlusion/diffraction effects in real time using the GPU. In contrast to our previous approach, the acoustical properties of an object are stored in a texture map, allowing the properties to vary across the surface of the model. The method is computationally efficient, allowing it to be incorporated into real-time, dynamic, and interactive virtual environments and video games whose scenes are arbitrarily complex.
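
The paper itself does not include code; the following is a minimal, hypothetical CUDA sketch of the general idea of an "acoustical texture map": per-texel acoustic attenuation stored in a texture and sampled on the GPU for surface points that lie between the source and the receiver. The kernel name, parameters, and the simple per-sample accumulation are assumptions for illustration only, not the authors' implementation, and host-side texture-object setup is omitted.

```cuda
// Hypothetical sketch (not the authors' code): each thread handles one
// surface sample point on the occluding geometry, looks up the acoustic
// attenuation stored in a texture map at that point's UV coordinates,
// and writes a per-sample occlusion contribution.
#include <cuda_runtime.h>

__global__ void sampleAcousticTexture(cudaTextureObject_t acousticTex, // per-texel attenuation in [0, 1]
                                      const float2 *uv,                // UV of each surface sample point
                                      float *occlusion,                // output: per-sample contribution
                                      int numSamples)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numSamples) return;

    // Bilinearly filtered lookup of the surface's acoustic attenuation,
    // which can vary across the model because it is stored per texel.
    float attenuation = tex2D<float>(acousticTex, uv[i].x, uv[i].y);

    // Contribution of this sample to the overall occlusion estimate; a full
    // implementation would also weight by the diffraction geometry between
    // source, occluder, and receiver.
    occlusion[i] = attenuation;
}
```

The per-sample results would then be reduced (e.g., averaged) on the GPU to obtain a single occlusion/diffraction factor applied to the sound source, keeping the per-frame cost low enough for interactive scenes.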