A comprehensive framework for auditory display: Comments on Barrass, ICAD 1994

  • Author: Stephen Barrass
  • Affiliation: University of Canberra, Australia
  • Venue: ACM Transactions on Applied Perception (TAP)
  • Year: 2005

Abstract

In ‘A Perceptual Framework for the Auditory Display of Scientific Data’ I described the first perceptually scaled sound space designed specifically for sonification. I modeled this sound space, and its underlying theory, on the use of perceptual colour spaces in scientific visualization. As I went on to apply the sound space in mappings of satellite data, I introduced methods of data characterization and user-centered task analysis into my design framework. In trials I realized that satellite images allow you to see global information across millions of data values, whereas it was impossible to play all the data at once as sounds. This led me to explore perceptual streaming as a means for perceiving similarity and difference in masses of sounded data. In work on sonifications for virtual reality applications I recognized the need to consider the semiotic linkage of the sound with the application domain, and the need to also link the sound with the interaction metaphor. The work described in this paper laid the foundation for the ongoing development of a comprehensive framework for auditory display that takes into consideration the perceptual organization of the sounds, the characteristics of the data, the gamut of the display device, the user's tasks, the semiotic linkage to the application domain, and the affordances for interaction.