A system for three-dimensional acoustic "visualization" in a virtual environment workstation

  • Authors:
  • Elizabeth M. Wenzel (NASA-Ames Research Center, Moffett Field, CA)
  • Scott S. Fisher (11571 Buena Vista Drive, Los Altos Hills, CA)
  • Philip K. Stone (NASA-Ames Research Center, Moffett Field, CA)
  • Scott H. Foster (Crystal River Engineering, Groveland, CA)

  • Venue:
  • VIS '90: Proceedings of the 1st Conference on Visualization '90
  • Year:
  • 1990



Abstract

This paper describes the real-time acoustic display capabilities developed for the Virtual Environment Workstation (VIEW) project at NASA-Ames Research Center. The acoustic display generates localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory "objects" or "icons," can be designed using ACE, the Auditory Cue Editor, which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with three-dimensional visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
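
To make the idea of an auditory symbology concrete, the sketch below shows one way a continuously varying data value could be linked to the parameters of a spatialized auditory cue, in the spirit of what the abstract attributes to ACE. The names (AuditoryIcon, map_value, make_warning_icon) and the specific parameter ranges are illustrative assumptions, not the actual ACE interface or the VIEW system's implementation.

```python
# Hypothetical sketch of an "auditory icon" parameter mapping.
# Assumption: a data value (e.g., a sensor reading) is mapped onto the pitch of a
# cue while the cue's spatial position tracks the location of the associated event.

from dataclasses import dataclass


@dataclass
class AuditoryIcon:
    """A representational auditory 'object': a sound source with spatial and timbral parameters."""
    azimuth_deg: float    # horizontal direction of the virtual source, in degrees
    elevation_deg: float  # vertical direction of the virtual source, in degrees
    pitch_hz: float       # fundamental frequency of the cue
    level_db: float       # playback level relative to full scale


def map_value(value, lo, hi, out_lo, out_hi):
    """Linearly map a data value in [lo, hi] onto an acoustic parameter range, clamped at the ends."""
    t = (value - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)
    return out_lo + t * (out_hi - out_lo)


def make_warning_icon(pressure_kpa, source_azimuth_deg):
    """Build a continuously varying warning cue: pitch follows the sensor reading,
    while the cue stays localized at the direction of the event in the display."""
    return AuditoryIcon(
        azimuth_deg=source_azimuth_deg,
        elevation_deg=0.0,
        pitch_hz=map_value(pressure_kpa, 80.0, 120.0, 220.0, 880.0),
        level_db=-12.0,
    )


if __name__ == "__main__":
    # As the reading rises, the cue's pitch rises; its apparent direction is unchanged.
    for p in (85.0, 100.0, 118.0):
        print(make_warning_icon(p, source_azimuth_deg=45.0))
```

In an actual real-time display, each such icon would be handed to a spatial audio renderer and updated every frame as the underlying data and the listener's head position change; the sketch only illustrates the parameter-linking step described in the abstract.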