Reducing reversal errors in localizing the source of sound in virtual environment without head tracking

  • Authors:
  • Vladimir Ortega-González, Samir Garbaya, Frédéric Merienne

  • Affiliations:
  • Arts et Metiers ParisTech, CNRS, Le2i, Institut Image, Chalon-sur-Saône, France (all authors)

  • Venue:
  • HAID'10 Proceedings of the 5th international conference on Haptic and audio interaction design
  • Year:
  • 2010

Abstract

This paper presents a study of the effect of additional audio cueing combined with the Head-Related Transfer Function (HRTF) on human performance in a sound source localization task performed without head movement. Existing sound spatialization techniques generate reversal errors. We intend to reduce these errors by introducing sensory cues based on sound effects. We conducted an experimental study to evaluate the impact of these additional cues on sound source localization. The results showed the benefit of combining the additional cues with HRTF in terms of localization accuracy and the reduction of reversal errors. This technique yields a significant reduction of reversal errors compared to using HRTF alone. It could be used, for instance, to improve audio spatial alerting, spatial tracking and target detection in simulation applications when head tracking is not available.
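The reversal errors discussed in the abstract are typically front-back confusions, where a listener places a sound at the mirror image of its true position across the interaural axis. The sketch below is an illustrative way to classify such a confusion from a target and a response azimuth; it is not the evaluation procedure used in the paper, and the function name and tolerance parameter are assumptions chosen for clarity.

```python
def is_front_back_reversal(target_az_deg, response_az_deg, tol_deg=10.0):
    """Classify a localization response as a front-back reversal.

    Azimuths are in degrees in the horizontal plane, 0 = straight ahead,
    positive to the right. A response counts as a reversal when it lies
    close to the mirror image of the target across the interaural
    (left-right) axis, i.e. az -> 180 - az, but not close to the target.
    """
    def angular_diff(a, b):
        # Smallest absolute difference between two angles, in degrees.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    mirrored = 180.0 - target_az_deg  # reflection across the interaural axis
    near_target = angular_diff(response_az_deg, target_az_deg) <= tol_deg
    near_mirror = angular_diff(response_az_deg, mirrored) <= tol_deg
    return near_mirror and not near_target

# Example: a target at 30 deg (front-right) answered at 150 deg (back-right)
# is a reversal; an answer at 35 deg is not.
print(is_front_back_reversal(30.0, 150.0))  # True
print(is_front_back_reversal(30.0, 35.0))   # False
```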