In this paper we analyze the interaction of blind users with Apple touchscreen devices (iPad, iPhone, and iPod touch), which are accessible to visually impaired people thanks to the pre-installed VoiceOver screen reader and magnifier. Specifically, we focus on the gestures that VoiceOver offers to simplify interaction for blind users. A usability inspection of the devices' user interfaces was performed and integrated with user feedback collected via an online survey of 55 totally blind users. The results confirm that VoiceOver makes Apple devices basically accessible to blind users, but some usability issues remain. Users generally regard the accessibility provided by VoiceOver as an important innovation, but some operations, such as writing long text, take too long or are uncomfortable. The results suggest that a multimodal approach on mobile touchscreen devices does not yet offer a simple and satisfactory interaction paradigm for everyone and deserves further investigation. Three possible solutions for improving user interface interaction and offering a simpler, more comfortable experience for blind individuals were proposed to the survey participants, who responded positively.