Auditory display research for driving has focused mainly on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation, or verbal tasks such as reading digit strings). The present study used a dual-task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., time-compressed speech) and a spindex (i.e., a speech index that uses brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized list of 150 song titles, rendered as an auditory menu, while concurrently playing a simple perceptual-motor ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Performance on both the primary task (game success rate) and the secondary task (menu search time) was better with the auditory menus than with no sound. Subjective workload scores (NASA TLX) and user preferences favored the enhanced auditory cue types. Results are discussed in terms of multiple resources theory and practical in-vehicle technology (IVT) design applications.
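To make the spindex idea concrete, the sketch below shows one plausible way to schedule cues for an alphabetized auditory menu: each item is preceded by a brief initial-letter cue (the spindex), and the full TTS title is suppressed while the user scrolls quickly. This is an illustrative sketch only, not the authors' implementation; the function name and the fast-scrolling behavior are assumptions.

```python
def schedule_cues(titles, scrolling_fast=False):
    """Return (cue, spoken_text) pairs for an alphabetized auditory menu.

    A spindex cue is a brief audio rendering of an item's initial letter,
    telling the user where they are in a long list. During fast scrolling
    only the letter cue is played (spoken_text is None); at normal speed
    the cue is followed by the full text-to-speech (TTS) title.
    """
    events = []
    for title in titles:
        cue = title[0].upper()  # spindex: brief initial-letter cue
        spoken = None if scrolling_fast else title  # full TTS only when browsing slowly
        events.append((cue, spoken))
    return events

# Example: skimming a song list by letter.
menu = ["Angie", "Angel of Harlem", "Back in Black", "Born to Run"]
print(schedule_cues(menu, scrolling_fast=True))
# -> [('A', None), ('A', None), ('B', None), ('B', None)]
```

In a real system, each `cue` would trigger playback of a prerecorded letter sound and `spoken` would be passed to a TTS engine; spearcons would additionally time-compress the synthesized titles before playback.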