While sighted users may learn to perform touchscreen gestures through observation (e.g., of other users or video tutorials), such mechanisms are inaccessible to users with visual impairments. As a result, learning to perform gestures can be challenging. We propose and evaluate two techniques for teaching touchscreen gestures to users with visual impairments: (1) corrective verbal feedback, which uses text-to-speech and automatic analysis of the user's drawn gesture; and (2) gesture sonification, which generates sound based on finger touches to create an audio representation of a gesture. To refine and evaluate the techniques, we conducted two controlled lab studies. The first study, with 12 sighted participants, compared parameters for sonifying gestures in an eyes-free scenario and identified pitch + stereo panning as the best combination. In the second study, 6 blind and low-vision participants completed gesture replication tasks with the two feedback techniques. Subjective data and preliminary performance findings indicate that the techniques offer complementary advantages.
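The abstract does not specify how the pitch + stereo panning mapping was implemented. As an illustration only, one plausible sketch maps a touch point's vertical position to pitch and its horizontal position to equal-power stereo pan; the function name, frequency range, and normalized [0, 1] coordinates below are assumptions, not details from the paper.

```python
import math

def sonify_touch(x, y, f_min=200.0, f_max=1000.0):
    """Map a normalized touch point (x, y in [0, 1]) to audio parameters.

    Illustrative sketch: y (bottom to top) controls pitch linearly;
    x (left to right) controls equal-power stereo panning.
    Returns (frequency_hz, left_gain, right_gain).
    """
    freq = f_min + y * (f_max - f_min)        # higher on screen -> higher pitch
    left = math.cos(x * math.pi / 2)           # x = 0 -> full left
    right = math.sin(x * math.pi / 2)          # x = 1 -> full right
    return freq, left, right

# A touch in the screen center yields a mid-range tone, balanced in both ears.
freq, left, right = sonify_touch(0.5, 0.5)
```

Equal-power panning keeps the perceived loudness constant as the finger moves horizontally, which matters if listeners are to judge direction from relative channel level.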