Investigating background & foreground interactions using spatial audio cues

  • Authors:
  • Yolanda Vazquez-Alvarez; Stephen Brewster

  • Affiliation:
  • University of Glasgow, Glasgow, United Kingdom (both authors)

  • Venue:
  • CHI '09 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2009

Abstract

Audio is a key feedback mechanism in eyes-free and mobile computer interaction. Spatial audio, which allows us to localize a sound source in 3D space, can offer a means of shifting focus between audio streams as well as increasing the richness and differentiation of audio cues. However, spatial audio has only recently become available on mobile phones, so this new technology must be calibrated before any further spatial audio research can build on it. In this paper we report an evaluation of the spatial audio capabilities supported by a Nokia N95 8GB mobile phone. Participants were able to discriminate significantly between five audio sources on the frontal horizontal plane. Results also highlighted possible individual variation caused by earedness and handedness. We then introduce the concept of audio minimization and describe work in progress using the Nokia N95's 3D audio capability to implement and evaluate audio minimization in an eyes-free mobile environment.
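
The sketch below illustrates the two ideas the abstract relies on: placing sound sources at azimuths on the frontal horizontal plane, and "minimizing" a background stream relative to a foreground one. It is not the authors' implementation (the paper used the Nokia N95 8GB's on-device 3D audio support); it approximates frontal-plane positioning with simple equal-power stereo panning, and audio minimization as a fixed gain reduction. All function names, the five azimuths, and the -12 dB attenuation are illustrative assumptions.

```python
import math


def equal_power_pan(azimuth_deg: float) -> tuple[float, float]:
    """Map an azimuth on the frontal horizontal plane (-90 = hard left,
    0 = centre, +90 = hard right) to left/right channel gains."""
    # Normalise the azimuth to a pan angle in [0, pi/2].
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)  # (left_gain, right_gain)


def minimize(gains: tuple[float, float], attenuation_db: float = -12.0) -> tuple[float, float]:
    """Push a stream into the background by attenuating it: a crude
    stand-in for the audio-minimization concept described in the paper."""
    scale = 10.0 ** (attenuation_db / 20.0)
    left, right = gains
    return left * scale, right * scale


if __name__ == "__main__":
    # Five candidate positions on the frontal horizontal plane, echoing
    # the five-source discrimination task in the evaluation (assumed angles).
    for azimuth in (-90, -45, 0, 45, 90):
        fg = equal_power_pan(azimuth)
        bg = minimize(fg)
        print(f"{azimuth:+4d} deg  foreground L/R = ({fg[0]:.2f}, {fg[1]:.2f})  "
              f"minimized L/R = ({bg[0]:.2f}, {bg[1]:.2f})")
```

Equal-power panning keeps perceived loudness roughly constant as a source moves across the frontal plane, which is why it is used here as a stand-in; a real handset implementation would instead rely on the phone's HRTF-based 3D audio engine.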