People who have visual impairments may have difficulty navigating freely and without personal assistance, and some are even afraid to go out alone. Existing navigation devices with non-visual feedback are few and expensive, and they generally focus on routing and target finding. We have developed a test prototype application, running on the Android platform, in which a user can scan for map information by using the mobile phone as a pointing device, in order to orient herself, choose targets for navigation, and be guided to them. Proof-of-concept studies have previously shown that scanning and pointing can be useful both for getting information about different locations and for being guided to a point. In the present study we describe the design of PointNav, a prototype navigational application, and report initial results from a recent test with visually impaired and sighted users.
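The core of such pointing-based scanning is deciding which nearby targets fall within the direction the phone is aimed. As a minimal sketch (not the authors' implementation; the positions, the `half_angle` scan width, and the function names are illustrative assumptions), one can compute the great-circle bearing from the user to each target and keep those whose bearing lies within a tolerance of the compass heading:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def targets_in_scan(user_pos, heading_deg, targets, half_angle=15.0):
    """Return names of targets whose bearing is within +/- half_angle
    of the device heading (a hypothetical scan-cone width)."""
    hits = []
    for name, (lat, lon) in targets.items():
        b = bearing_deg(user_pos[0], user_pos[1], lat, lon)
        # signed angular difference, wrapped to (-180, 180]
        diff = (b - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= half_angle:
            hits.append(name)
    return hits
```

On a real device the heading would come from the compass/orientation sensors and the user position from GPS; the hits could then be rendered as speech or non-visual feedback.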