Earcons as a Method of Providing Navigational Cues in a Menu Hierarchy
HCI '96 Proceedings of HCI on People and Computers XI
This article describes three experiments that investigate the possibility of using structured non-speech audio messages, called earcons, to provide navigational cues in a menu hierarchy. A hierarchy of 27 nodes and four levels was created, with an earcon for each node, and rules were defined for the creation of hierarchical earcons at each node. Participants had to identify their location in the hierarchy by listening to an earcon. Results of the first experiment showed that participants could identify their location with 81.5% accuracy, indicating that earcons are a powerful method of communicating hierarchy information.

One proposed use for such navigation cues is in telephone-based interfaces (TBIs), where navigation is a problem. The first experiment did not address the particular problems of earcons in TBIs, such as “does the lower quality of sound over the telephone lower recall rates?”, “can users remember earcons over a period of time?” and “what effect does training type have on recall?” A second experiment was conducted, and results showed that sound quality did lower the recall of earcons. However, a redesign of the earcons overcame this problem, with 73% recalled correctly, and participants could still recall earcons at this level after a week had passed. Training type also affected recall: with personal training, participants recalled 73% of the earcons, but with purely textual training results were significantly lower. These results show that earcons can provide good navigation cues for TBIs.

The final experiment used compound, rather than hierarchical, earcons to represent the hierarchy from the first experiment. Results showed that with sounds constructed in this way participants could recall 97% of the earcons. These experiments have developed our general understanding of earcons: a hierarchy three times larger than any previously created was tested, and this was also the first test of the recall of earcons over time.
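The core idea behind hierarchical earcons — each node's sound inherits its parent's motif and adds one new distinguishing element, so the earcon encodes the path from the root — can be sketched as a simple data structure. This is an illustrative model only, not the paper's actual sound-design rules; the node names and sound parameters below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class EarconNode:
    """A node in a menu hierarchy with its associated earcon motif."""
    name: str
    motif: list          # ordered sound elements, e.g. ("timbre", "sine")
    children: dict = field(default_factory=dict)

    def add_child(self, name, new_element):
        # A child inherits the full parent motif and appends exactly one
        # new element (e.g. a change of timbre, rhythm, or pitch), so
        # hearing a node's earcon reveals its location in the hierarchy.
        child = EarconNode(name, self.motif + [new_element])
        self.children[name] = child
        return child

# Hypothetical three-level branch of a menu hierarchy.
root = EarconNode("root", [("timbre", "sine")])
files = root.add_child("files", ("rhythm", "two short notes"))
docs = files.add_child("documents", ("pitch", "rising"))

# The deepest node's motif contains one element per level of its path.
print(docs.motif)
```

Listening to `docs`'s earcon plays all three elements in order, which is what lets a user identify their location without any visual display; a compound earcon, by contrast, would concatenate independently meaningful sounds rather than inherit a parent motif.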