As mobile-phone design moves toward touch-screen form factors, visually impaired users face new accessibility challenges. The mainstream interaction model for touch-screen devices assumes that the user can see spatially arranged visual icons and interact with them through a smooth glass screen. An inherent challenge of this interface for blind users is its lack of tactile feedback. In this paper we explore combining spatial audio with accelerometer technology to enable blind users to operate a touch-screen device effectively. We discuss the challenges involved in representing icons with sound, and we introduce a design framework that is helping us tease out some of these issues. We also outline a set of proposed user studies, to be run on a Nokia N97, that will test the effectiveness of our design. The results of these studies will be presented at ICCHP 2010.
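One way the combination of accelerometer input and spatial audio could work is sketched below. This is a hypothetical illustration, not the paper's actual design: it assumes device tilt (pitch/roll in degrees) is mapped to a cursor over a grid of icons, and each icon's audio cue is stereo-panned by its horizontal offset from the cursor, so a blind user can locate icons by ear.

```python
import math

# Hypothetical sketch (not from the paper): map accelerometer tilt to a
# cursor over an icon grid, and pan each icon's audio cue relative to
# the cursor using equal-power stereo panning.

def tilt_to_cell(pitch_deg, roll_deg, rows, cols, max_tilt=30.0):
    """Map pitch/roll (degrees) to a (row, col) grid cell."""
    def norm(t):
        # clamp tilt into [-max_tilt, max_tilt], normalise to [0, 1]
        t = max(-max_tilt, min(max_tilt, t))
        return (t + max_tilt) / (2 * max_tilt)
    row = min(rows - 1, int(norm(pitch_deg) * rows))
    col = min(cols - 1, int(norm(roll_deg) * cols))
    return row, col

def stereo_pan(icon_col, cursor_col, cols):
    """Return (left_gain, right_gain) for an icon's audio cue,
    panned by its horizontal offset from the cursor."""
    # offset in [-1, 1]: negative means the icon lies to the left
    offset = (icon_col - cursor_col) / max(1, cols - 1)
    angle = (offset + 1) * math.pi / 4  # map [-1, 1] -> [0, pi/2]
    return math.cos(angle), math.sin(angle)
```

For example, a level device (`pitch = roll = 0`) selects the centre cell of a 3x3 grid, and an icon two columns to the left of the cursor plays almost entirely in the left channel. Equal-power panning keeps perceived loudness constant as a cue moves across the stereo field.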