Interactive sound synthesis for large scale environments

  • Authors:
  • Nikunj Raghuvanshi; Ming C. Lin

  • Affiliations:
  • University of North Carolina at Chapel Hill; University of North Carolina at Chapel Hill

  • Venue:
  • I3D '06 Proceedings of the 2006 symposium on Interactive 3D graphics and games
  • Year:
  • 2006

Abstract

We present an interactive approach for generating realistic physically-based sounds from rigid-body dynamic simulations. We use spring-mass systems to model each object's local deformation and vibration, which we demonstrate to be an adequate approximation for capturing physical effects such as the magnitude of impact forces, the location of impact, and rolling sounds. No assumption is made about the mesh connectivity or topology: the surface meshes used for rigid-body dynamic simulation are used for sound simulation without any modification. We use results from auditory perception and a novel priority-based quality scaling scheme to enable the system to meet variable, stringent time constraints in a real-time application, while ensuring minimal reduction in the perceived sound quality. With this approach, we have observed up to an order of magnitude speed-up compared to an implementation without the acceleration. As a result, we are able to handle moderately complex scenes with up to hundreds of sounding objects at over 100 frames per second (FPS), making this technique well suited for interactive applications like games and virtual environments. Furthermore, we utilize OpenAL and EAX™ on Creative Sound Blaster Audigy 2™ cards for fast hardware-accelerated propagation modeling of the synthesized sound.
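
To make the basic idea concrete, the sketch below shows one possible (and deliberately simplified) spring-mass vibration model for impact sounds: a small chain of masses connected by springs is struck with an impulse, integrated at audio rate, and its damped displacement is mixed into an audio buffer. This is not the authors' implementation; all parameters (mass, stiffness, damping, sample rate) and the 1-D chain topology are illustrative assumptions, whereas the paper builds the spring-mass system directly from the object's surface mesh.

```python
# Illustrative sketch only: a 1-D mass-spring chain struck by an impulse,
# integrated with semi-implicit Euler at audio rate. Parameters are made up.
import numpy as np

def synthesize_impact(n_masses=8, mass=1e-3, stiffness=4e5, damping=0.02,
                      sample_rate=44100, duration=0.5, hit_index=0,
                      impulse=1.0):
    """Return a normalized audio buffer produced by a damped mass-spring chain."""
    dt = 1.0 / sample_rate
    x = np.zeros(n_masses)           # displacements
    v = np.zeros(n_masses)           # velocities
    v[hit_index] = impulse / mass    # impulsive excitation at the contact point
    n_samples = int(duration * sample_rate)
    out = np.empty(n_samples)
    for i in range(n_samples):
        # spring forces between neighbouring masses
        ext = stiffness * (x[1:] - x[:-1])
        f = np.zeros(n_masses)
        f[:-1] += ext
        f[1:]  -= ext
        # anchor the end masses to ground so there is no rigid-body drift
        f[0]  -= stiffness * x[0]
        f[-1] -= stiffness * x[-1]
        f -= damping * v             # simple viscous damping
        v += (f / mass) * dt         # semi-implicit Euler step
        x += v * dt
        out[i] = x.sum()             # crude "surface displacement" mix
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out

if __name__ == "__main__":
    audio = synthesize_impact()
    print(f"{audio.size} samples, peak {np.max(np.abs(audio)):.3f}")
```

In the paper, the same principle is applied to the unmodified simulation mesh, and the priority-based quality scaling decides, per frame, how much of this per-object vibration computation can be afforded within the real-time budget.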