Evaluating environmental sounds from a presence perspective for virtual reality applications

  • Authors:
  • Rolf Nordahl

  • Affiliations:
  • Medialogy, Aalborg University Copenhagen, Ballerup, Denmark

  • Venue:
  • EURASIP Journal on Audio, Speech, and Music Processing - Special issue on environmental sound synthesis, processing, and retrieval
  • Year:
  • 2010

Abstract

We propose a methodology for designing and evaluating environmental sounds for virtual environments. The approach combines physically modeled sound events with recorded soundscapes: physical models provide feedback to users' actions, while soundscapes reproduce the characteristic soundmarks of an environment. In this case study, physical models simulate the act of walking in the botanical garden of the city of Prague, while soundscapes reproduce the particular sound of the garden. The designed auditory feedback was combined with a photorealistic reproduction of the same garden. A between-subjects experiment with 126 participants was conducted across six experimental conditions, including both unimodal and bimodal (auditory and visual) stimuli. The auditory stimuli consisted of several combinations of auditory feedback, including static sound sources as well as self-induced interactive sounds simulated with physical models. Results show that subjects' motion in the environment is significantly enhanced when dynamic sound sources and sounds of ego-motion are rendered in the environment.
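
To illustrate the general idea of layering self-induced sound events over a recorded soundscape, here is a minimal sketch in Python. It is not the paper's actual synthesis engine: the `footstep_burst` function is a hypothetical stand-in (a decaying noise burst) for the physically modeled walking sounds, and the step times are assumed to come from the user's tracked motion in the virtual environment.

```python
import numpy as np

SR = 44100  # sample rate in Hz


def footstep_burst(duration=0.12, decay=40.0, seed=None):
    """Rough stand-in for a physically modeled footstep:
    an exponentially decaying noise burst. The paper's actual
    physical model of walking sounds is more elaborate."""
    rng = np.random.default_rng(seed)
    n = int(duration * SR)
    t = np.arange(n) / SR
    return rng.standard_normal(n) * np.exp(-decay * t)


def render_scene(soundscape, step_times, step_gain=0.5):
    """Mix self-induced footstep events into a recorded soundscape.
    `soundscape` is a mono float array; `step_times` are in seconds,
    e.g. derived from the user's motion in the virtual garden."""
    out = soundscape.copy()
    for i, t0 in enumerate(step_times):
        start = int(t0 * SR)
        if start >= len(out):
            continue  # step falls outside the rendered buffer
        burst = step_gain * footstep_burst(seed=i)
        end = min(start + len(burst), len(out))
        out[start:end] += burst[: end - start]
    return np.clip(out, -1.0, 1.0)


if __name__ == "__main__":
    # Placeholder ambience: low-level noise standing in for a garden recording.
    ambience = 0.05 * np.random.default_rng(0).standard_normal(5 * SR)
    mixed = render_scene(ambience, step_times=[0.5, 1.1, 1.7, 2.3, 2.9])
    print(mixed.shape, mixed.dtype)
```

In a real application the placeholder ambience would be replaced by the recorded soundscape of the garden, and the footstep bursts by the output of the physical model driven by the user's gait, so that static soundmarks and interactive ego-motion sounds are rendered together.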