Personal guidance system for the visually impaired. Assets '94: Proceedings of the First Annual ACM Conference on Assistive Technologies.
Wearable interfaces for orientation and wayfinding. Assets '00: Proceedings of the Fourth International ACM Conference on Assistive Technologies.
The audio abacus: representing numerical values with nonspeech sound for the visually impaired. Assets '04: Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility.
Talking braille: a wireless ubiquitous computing network for orientation and wayfinding. Proceedings of the 7th International ACM SIGACCESS Conference on Computers and Accessibility; interactions, Gadgets '06.
Autonomous navigation through the city for the blind. Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility.
Sighted people use vision to quickly gather rich information about their environment. Blind users are deprived of this information, which compromises their ability both to understand their surroundings and to navigate within them. Alternative sensory channels, such as sound or touch, are substituted for the missing stimulus [7]. This substituted information supports navigation in the immediate vicinity, e.g., the location of obstacles and hazards, but it does not convey the locations of distant, distinctive objects that sighted people often use as navigational landmarks. The opportunity exists to electronically augment a blind user's environment with information about distant landmarks: information that would, for example, allow a blind person to turn in a circle, listen to the landmarks around them, and then proceed in a direction guided by a chosen landmark.
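The turn-in-a-circle interaction described above can be sketched in a few lines: given the user's position and compass heading, compute the bearing to each labeled landmark and report those that lie roughly ahead, which an audio interface could then announce. This is a minimal illustrative sketch, not the paper's implementation; the landmark names, coordinates, and the 30-degree cone are all assumptions made up for the example.

```python
import math

def bearing_deg(user, landmark):
    """Compass-style bearing (0 = north, clockwise) from user to a
    landmark on a flat x/y plane (x = east, y = north). A flat plane is
    an assumption; a real system would use geodetic coordinates."""
    dx = landmark[0] - user[0]
    dy = landmark[1] - user[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def landmarks_ahead(user, heading_deg, landmarks, cone_deg=30.0):
    """Return (name, distance) pairs for landmarks within +/- cone_deg
    of the user's current heading, nearest first."""
    hits = []
    for name, pos in landmarks.items():
        # Signed angular difference in (-180, 180].
        diff = (bearing_deg(user, pos) - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= cone_deg:
            hits.append((name, math.hypot(pos[0] - user[0], pos[1] - user[1])))
    return sorted(hits, key=lambda h: h[1])

# Hypothetical landmarks around a user standing at the origin.
landmarks = {"library": (0.0, 100.0), "fountain": (80.0, 5.0), "station": (-60.0, -60.0)}
# Facing due north (0 degrees): only the library lies inside the cone.
print(landmarks_ahead((0.0, 0.0), 0.0, landmarks))
```

As the user rotates, calling `landmarks_ahead` with the updated heading yields the landmarks to announce at each orientation, so the audible scene changes with the turn.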