Integrating real-time binaural acoustics into VR applications

  • Authors:
  • I. Assenmacher; T. Kuhlen; T. Lentz; M. Vorländer

  • Affiliations:
  • Center for Computing and Communication, RWTH Aachen University; Center for Computing and Communication, RWTH Aachen University; Institute of Technical Acoustics, RWTH Aachen University; Institute of Technical Acoustics, RWTH Aachen University

  • Venue:
  • EGVE'04: Proceedings of the Tenth Eurographics Conference on Virtual Environments
  • Year:
  • 2004

Abstract

Common research in the field of Virtual Reality (VR) considers acoustic stimulation a highly important requirement for enhanced immersion in virtual scenes. However, most common VR toolkits offer the application programmer only marginal support for integrating sound. Furthermore, the quality of the stimulation they provide usually ranges from system sounds (e.g., beeps while selecting a menu item) to simple 3D panning. In the latter case, the user can correctly localize only sounds that are at a considerable distance from his or her current position. Binaural synthesis is an interesting way to achieve a spatial auditory representation using few loudspeakers or headphones. This paper describes a system that creates a binaural representation in real time for a listener interacting in a common visual VR application, thus enabling research on the interaction between the human visual and auditory perception systems. It describes the theoretical background of establishing a binaural representation of a sound and the necessary hardware set-up, and then discusses the infrastructure and software interface that connect the audio renderer to a visual VR toolkit.
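The core operation behind the binaural synthesis described in the abstract is typically a convolution of each mono source signal with a pair of head-related impulse responses (HRIRs) selected for the source's direction relative to the listener's tracked head. A minimal sketch in Python with NumPy, using synthetic placeholder HRIRs (a pure delay and attenuation standing in for measured responses, which is an assumption for illustration only):

```python
import numpy as np

def binaural_render(mono, hrir_left, hrir_right):
    """Render a mono source to a two-channel binaural signal by
    convolving it with left/right head-related impulse responses."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    # Pad both channels to a common length before stacking.
    n = max(len(left), len(right))
    left = np.pad(left, (0, n - len(left)))
    right = np.pad(right, (0, n - len(right)))
    return np.stack([left, right])

# Toy example: placeholder HRIRs that delay and attenuate the right
# ear, mimicking the interaural time and level differences of a
# source located to the listener's left.
fs = 44100
t = np.arange(fs // 10) / fs                    # 100 ms of samples
mono = np.sin(2 * np.pi * 440 * t)              # 440 Hz test tone
hrir_left = np.array([1.0])                     # direct path, full level
hrir_right = np.zeros(30)
hrir_right[-1] = 0.5                            # ~0.7 ms delay, half amplitude

out = binaural_render(mono, hrir_left, hrir_right)
print(out.shape)  # → (2, 4439)
```

In a real-time system such as the one the paper describes, this convolution runs block-wise (e.g., via partitioned FFT convolution), and the HRIR pair is exchanged whenever the tracked listener or the source moves.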