In this paper, we propose a mobile navigation system that guides the user with auditory information alone, namely music. The increasing sophistication of mobile devices has made contextual information, such as a pedestrian's location and direction of motion, available to mobile navigation systems. Typically, such systems display the current position and the destination on an on-screen map, which restricts the pedestrian's movement because the user must hold the device and watch the screen. We have therefore implemented a mobile navigation system that guides the pedestrian unobtrusively by embedding direction information in music: after measuring the angular resolution at which users can perceive direction, the system shifts the phase of the musical sound to indicate the route. Experiments with this system verify the effectiveness of the proposed approach.
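The core idea of shifting the phase of the music to convey direction can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a spherical-head interaural-time-difference model (the Woodworth approximation) and hypothetical parameter values for head radius and sample rate, and it delays one stereo channel by the corresponding number of samples so that the music appears to come from the bearing toward the destination.

```python
import numpy as np

SAMPLE_RATE = 44100     # Hz; assumed playback rate
HEAD_RADIUS = 0.0875    # m; average head radius (assumption)
SPEED_OF_SOUND = 343.0  # m/s

def itd_seconds(bearing_deg: float) -> float:
    """Interaural time difference for a source at `bearing_deg`
    (0 = straight ahead, positive = to the right), using the
    Woodworth spherical-head approximation."""
    theta = np.radians(bearing_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))

def pan_by_phase(mono: np.ndarray, bearing_deg: float) -> np.ndarray:
    """Return a stereo signal whose inter-channel delay (phase)
    encodes the bearing toward the destination."""
    delay = int(round(abs(itd_seconds(bearing_deg)) * SAMPLE_RATE))
    # Prepend `delay` samples of silence to the ear farther from the target.
    delayed = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    if bearing_deg >= 0:  # target to the right: the left ear hears it later
        return np.stack([delayed, mono], axis=1)
    return np.stack([mono, delayed], axis=1)
```

In practice the bearing would be recomputed continuously from GPS position and heading, and the delay would be quantized to the measured perceptual resolution so that only distinguishable direction steps are rendered.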