Earcons and icons: their structure and common design principles
Human-Computer Interaction
The potential utility of dividing the information flowing from computer to human among several sensory modalities is investigated through a controlled experiment comparing the effectiveness of auditory and visual cues in a visual search task. The results indicate that a complex auditory cue can replace cues traditionally presented in the visual modality. Implications for the design of multimodal workstations are discussed.