Sighted individuals develop much of their knowledge about their environment through what they can visually observe. In contrast, individuals who are visually impaired acquire such knowledge largely through information that is explicitly relayed to them. This paper examines the practices that visually impaired individuals use to learn about their environments and the challenges they face in doing so. In the first of two studies, we uncover four types of information needed to master and navigate an environment. We detail how an individual's context affects their ability to learn this information, and we outline requirements for independent spatial learning. In a second study, we explore how individuals learn about places and activities in their environment. Our findings show that users learn information not only to satisfy immediate needs but also to enable future opportunities -- something existing technologies do not fully support. From these findings, we discuss research and design opportunities for assisting individuals who are visually impaired with independent spatial learning.