Users interact with mobile devices through menus, which can include many items. Auditory menus have the potential to make those devices more accessible to a wide range of users. However, auditory menus are a relatively new concept, and few guidelines describe how to design them. In this paper, we detail how visual menu concepts may be applied to auditory menus in order to help develop design guidelines. Specifically, we examine how to optimize the design of a new contextual auditory cue called the "spindex" (i.e., speech index). We developed and evaluated several spindex design alternatives, iteratively refining the design with both sighted and visually impaired users. The "attenuated" spindex proved best in terms of both preference and performance across user groups. Nevertheless, sighted and visually impaired participants gave slightly different responses and feedback. Results are discussed in terms of acoustical theory, practical display design, and assistive technology design.
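To make the "attenuated" spindex concrete, the sketch below shows one plausible cue-scheduling scheme: each menu item is announced with a brief speech cue derived from its initial letter, and consecutive items that share that letter play the cue at reduced volume, so full-volume cues mark alphabetical transitions while scrolling. The function name, the single-letter cue, and the specific gain values are illustrative assumptions, not parameters reported in the paper.

```python
def spindex_cues(menu_items, full_gain=1.0, attenuated_gain=0.3):
    """Return (item, cue_letter, gain) triples for an auditory menu.

    Hypothetical "attenuated" spindex scheduling: the first item in a
    run of identical initial letters gets a full-volume cue; the rest
    of the run gets the same cue at a reduced gain.
    """
    cues = []
    prev_letter = None
    for item in menu_items:
        letter = item[0].upper()
        # Attenuate the cue when the initial letter repeats.
        gain = full_gain if letter != prev_letter else attenuated_gain
        cues.append((item, letter, gain))
        prev_letter = letter
    return cues


if __name__ == "__main__":
    # Example contact list: runs of "A" and "B" are attenuated after
    # their first occurrence, so letter boundaries stand out.
    for item, letter, gain in spindex_cues(
        ["Adam", "Alice", "Bob", "Brian", "Carol"]
    ):
        print(f"{item}: cue '{letter}' at gain {gain}")
```

In a real implementation, each triple would drive a text-to-speech or prerecorded-phoneme playback engine, with the gain applied to the cue only, before the full item name is spoken.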