Gesture-based touch screen user interfaces, when designed for accessibility, can be an effective mode of interaction for blind users. However, current accessible touch screen interaction techniques share one serious limitation: they work only on devices that have been explicitly designed to support them. Access Lens is a new interaction method that uses computer vision-based gesture tracking to enable blind people to use accessible gestures on paper documents and other physical objects, such as product packages, device screens, and home appliances. This paper describes the development of the Access Lens hardware and software, the iterative design of Access Lens in collaboration with blind computer users, and opportunities for future development.
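The abstract does not specify how the gesture tracker works, but the core idea — locating a pointing fingertip in a camera frame and mapping it to a labeled region of the document — can be illustrated with a minimal sketch. The fingertip heuristic below (foreground pixel farthest from the blob centroid) and the `region_under` helper with its `(y0, x0, y1, x1, label)` boxes are assumptions for illustration only, not the actual Access Lens pipeline.

```python
import numpy as np

def find_fingertip(mask: np.ndarray):
    """Estimate the fingertip as the foreground pixel farthest from the
    blob's centroid -- a common cheap heuristic, assumed here for
    illustration; the real system may use a more robust tracker."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        raise ValueError("no foreground pixels in mask")
    cy, cx = ys.mean(), xs.mean()
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2
    i = int(np.argmax(d2))
    return int(ys[i]), int(xs[i])

def region_under(point, regions):
    """Return the label of the first region containing the point.
    Regions are hypothetical (y0, x0, y1, x1, label) boxes, e.g. from OCR."""
    y, x = point
    for y0, x0, y1, x1, label in regions:
        if y0 <= y < y1 and x0 <= x < x1:
            return label
    return None

# Synthetic demo: a hand-shaped blob (narrow "finger" atop a wide "palm")
# pointing into a labeled document region.
mask = np.zeros((100, 100), dtype=bool)
mask[40:80, 48:53] = True            # finger shaft
mask[80:100, 35:65] = True           # palm, pulls the centroid downward
tip = find_fingertip(mask)           # lands on the top of the finger
regions = [(30, 40, 50, 60, "Total: $42.17")]
print(tip, region_under(tip, regions))
```

In a live system this loop would run per camera frame, with the returned label handed to a screen reader for speech output; the synthetic mask stands in for a thresholded skin segmentation.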