Non-visual-cueing-based sensing and understanding of nearby entities in aided navigation.

  • Authors:
  • Juan Diego Gomez;Guido Bologna;Thierry Pun

  • Affiliations:
  • University of Geneva, Geneva, Switzerland;University of Geneva, Geneva, Switzerland;University of Geneva, Geneva, Switzerland

  • Venue:
  • Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility
  • Year:
  • 2012


Abstract

Exploring unfamiliar environments is a challenging task in which, additionally, unsighted individuals frequently fail to perceive obstacles or to make serendipitous discoveries, because the mental depiction of the context is drastically impoverished by the absence of visual information. It remains unclear in neuroscience whether stimuli elicited by visual cueing can be replicated by other senses (cross-modal transfer). In practice, however, everyone recognizes a key, whether it is felt in a pocket or seen on a table. We present a context-aware aid system for the blind that merges three levels of assistance to enhance the intelligibility of nearby entities: an exploration module that helps the user gain awareness of the surrounding context, an alerting method that warns the user when a stumble is likely, and a recognition engine that retrieves natural targets previously learned. Practical experience with our system shows that, in the absence of visual cueing, audio and haptic trajectory playback coupled with computer-vision methods is a promising approach to conveying dynamic information about the immediate environment.
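The abstract names three levels of assistance (exploration, alerting, recognition) operating over the same set of nearby entities. The paper itself does not give code, so the following is only a minimal illustrative sketch of how such a three-module pipeline could be factored; the `Entity` type, the `ALERT_RADIUS_M` threshold, and all function names are hypothetical, not taken from the authors' system.

```python
from dataclasses import dataclass

@dataclass
class Entity:
    label: str          # hypothetical name of the detected object
    distance_m: float   # distance from the user, in metres
    learned: bool       # whether the recognition engine was trained on it

ALERT_RADIUS_M = 1.0    # assumed threshold at which a stumble is "likely"

def explore(entities):
    """Exploration module: summarize every nearby entity for playback."""
    return [f"{e.label} at {e.distance_m:.1f} m" for e in entities]

def alert(entities):
    """Alerting module: names of entities close enough to warn about."""
    return [e.label for e in entities if e.distance_m < ALERT_RADIUS_M]

def recognize(entities):
    """Recognition engine: retrieve targets previously learned."""
    return [e.label for e in entities if e.learned]

# Example scene with three detected entities.
scene = [
    Entity("chair", 0.6, False),
    Entity("door", 2.5, False),
    Entity("key", 1.2, True),
]
print(explore(scene))
print(alert(scene))      # → ['chair']
print(recognize(scene))  # → ['key']
```

In the actual system the output of each module would drive audio and haptic trajectory playback rather than printed strings; the sketch only shows the separation of the three assistance levels.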