Current touch interfaces lack the rich tactile feedback that allows blind users to detect and correct errors. This is especially relevant for multitouch interactions, such as Braille input. We propose HoliBraille, a system that combines touch input with multi-point vibrotactile output on mobile devices. We believe this technology can offer several benefits to blind users: conveying feedback for complex multitouch gestures, improving input performance, and supporting inconspicuous interactions. In this paper, we present the design of our unique prototype, which allows users to receive localized multitouch vibrotactile feedback. Preliminary results on perceptual discrimination show average accuracies of 100% for single-point stimuli and 82% for chords. Finally, we discuss a text-entry application with rich tactile feedback.
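To illustrate the kind of chord-to-feedback mapping such a system implies, the sketch below pairs each Braille character with the set of actuators a multi-point vibrotactile device might fire. This is a hypothetical illustration, not HoliBraille's actual implementation: the one-motor-per-dot assumption and the `chord_to_motors` helper are ours; only the standard six-dot Braille cell layout is taken as given.

```python
# Hypothetical sketch of chord-to-actuator mapping for a
# HoliBraille-like system (not the authors' implementation).
# Dot numbering follows the standard six-dot Braille cell:
#   1 4
#   2 5
#   3 6
BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5}, "k": {1, 3}, "l": {1, 2, 3},
}

def chord_to_motors(char):
    """Return the set of vibrotactile motors to fire for a character.

    Assumes one motor per Braille dot (motors 1-6), so the chord a
    user types can be echoed back as localized multi-point feedback,
    letting the user verify the input before committing it.
    """
    return BRAILLE_DOTS[char.lower()]
```

For example, typing "l" (dots 1, 2, 3) would trigger the three motors under the dot-1, dot-2, and dot-3 positions, giving the user a chance to detect a mistyped chord before it is entered.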