GPU-based acoustical occlusion modeling with acoustical texture maps
Proceedings of the 6th Audio Mostly Conference: A Conference on Interaction with Sound
In typical environments, the direct path between a sound source and a listener is often occluded. However, owing to diffraction, sound still reaches the listener by “bending” around an obstacle that lies in the direct line of propagation. Modeling occlusion/diffraction effects is a difficult and computationally intensive task, and it is thus generally ignored in virtual reality and video game applications. Driven by the gaming industry, consumer computer graphics hardware, and the graphics processing unit (GPU) in particular, has advanced greatly in recent years, surpassing central processing units in raw computational capacity. Given the affordability, widespread use, and availability of graphics hardware, here we describe a computationally efficient GPU-based method that approximates acoustical occlusion/diffraction effects in real time. Although the method was developed primarily for video games, where occlusion/diffraction is typically overlooked, it is equally relevant to other dynamic and interactive virtual environments.
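To make the “bending” effect concrete, a classical back-of-the-envelope model is Maekawa's empirical formula for the insertion loss of a thin barrier: the attenuation grows with the Fresnel number N = 2δ/λ, where δ is the extra distance sound travels over the obstacle compared with the (occluded) straight-line path, and λ is the wavelength. This is not the GPU-based method of the paper, only an illustrative sketch of why occlusion acts as a frequency-dependent (low-pass) filter; the function names and the example geometry below are our own assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def fresnel_number(path_over_barrier: float, direct_path: float,
                   frequency: float) -> float:
    """Fresnel number N = 2*delta/lambda, where delta is the detour the
    diffracted path takes over the obstacle versus the direct path."""
    delta = path_over_barrier - direct_path
    wavelength = SPEED_OF_SOUND / frequency
    return 2.0 * delta / wavelength

def maekawa_attenuation_db(n: float) -> float:
    """Maekawa's empirical barrier attenuation, 10*log10(3 + 20*N) dB.
    Clamped to 0 dB where the formula would predict no insertion loss."""
    if n <= -0.2:
        return 0.0
    return max(0.0, 10.0 * math.log10(3.0 + 20.0 * n))

# Hypothetical example: source and listener 10 m apart; the shortest
# path over the occluding wall adds a 0.5 m detour.
for f in (250.0, 1000.0, 4000.0):
    n = fresnel_number(10.5, 10.0, f)
    print(f"{f:6.0f} Hz: N = {n:5.2f}, "
          f"attenuation = {maekawa_attenuation_db(n):5.1f} dB")
```

Running this shows the attenuation rising from roughly 12 dB at 250 Hz to about 24 dB at 4 kHz for the same geometry, which matches the everyday observation that sound heard around a corner is muffled: high frequencies diffract less and are attenuated more.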