Interfaces for advanced computing
Scientific American
Development of a three-dimensional auditory display system
ACM SIGCHI Bulletin
Grasping reality through illusion—interactive graphics serving science
CHI '88 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
An experiment into the use of auditory cues to reduce visual workload
CHI '89 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Stereophonic and surface sound generation for exploratory data analysis
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Virtual environment display system
I3D '86 Proceedings of the 1986 workshop on Interactive 3D graphics
Sound and computer information presentation
Proceedings of the 7th conference on Visualization '96
Auralization of streamline vorticity in computational fluid dynamics data
VIS '97 Proceedings of the 8th conference on Visualization '97
Data Sonification and Sound Visualization
Computing in Science and Engineering
Usage of multisensory information in scientific data sensualization
Multimedia Systems - Special issue on multimedia and multisensory virtual worlds
Menu selection using auditory interface
HCI'07 Proceedings of the 12th international conference on Human-computer interaction: intelligent multimodal interaction environments
Speech intelligibility in adverse conditions in recorded virtual auditory environments
ICAD'98 Proceedings of the 1998 international conference on Auditory Display
Surgical navigation system and method using audio feedback
ICAD'98 Proceedings of the 1998 international conference on Auditory Display
This paper describes the real-time acoustic display capabilities developed for the Virtual Environment Workstation (VIEW) project at NASA-Ames Research Center. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory "objects" or "icons," can be designed using ACE, the Auditory Cue Editor, which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with three-dimensional visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
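The mapping the abstract describes—linking discrete and continuously varying acoustic parameters to data or events—can be illustrated with a minimal sketch. This is not the VIEW/ACE implementation; the `AuditoryCue` type, parameter ranges, and `map_sample` function are hypothetical, standing in for how a scalar data stream might drive a localized cue's pitch and azimuth (continuous parameters) alongside a warning flag (a discrete parameter):

```python
# Hypothetical sketch (not the actual VIEW/ACE code): mapping one scalar
# sample onto the parameters of a spatialized auditory cue.
from dataclasses import dataclass


@dataclass
class AuditoryCue:
    pitch_hz: float      # continuously varying parameter
    azimuth_deg: float   # direction of the virtual source (-90 = left, +90 = right)
    alarm: bool          # discrete parameter, e.g. a warning state


def map_sample(value: float, vmin: float, vmax: float,
               threshold: float) -> AuditoryCue:
    """Linearly map a scalar sample to pitch and azimuth; flag a discrete alarm."""
    t = (value - vmin) / (vmax - vmin)   # normalize to [0, 1]
    t = min(max(t, 0.0), 1.0)            # clamp out-of-range samples
    return AuditoryCue(
        pitch_hz=220.0 + t * (880.0 - 220.0),  # span roughly two octaves
        azimuth_deg=-90.0 + t * 180.0,         # sweep the source left to right
        alarm=value > threshold,               # discrete event above threshold
    )
```

A mid-range sample such as `map_sample(50.0, 0.0, 100.0, 75.0)` lands at 550 Hz straight ahead with no alarm; values above the threshold additionally raise the discrete alarm flag, mirroring the range from continuous data representation to simple warnings described above.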