EyeSound: single-modal mobile navigation using directionally annotated music

  • Authors:
  • Shingo Yamano; Takamitsu Hamajo; Shunsuke Takahashi; Keita Higuchi

  • Affiliations:
  • Kanazawa Institute of Technology, Ishikawa, Japan (Yamano, Hamajo, Takahashi); The University of Tokyo, Tokyo, Japan (Higuchi)

  • Venue:
  • AH '12 Proceedings of the 3rd Augmented Human International Conference
  • Year:
  • 2012


Abstract

In this paper, we propose a mobile navigation system that guides the user with auditory information alone, namely music. Increasingly sophisticated mobile devices have enabled navigation systems to exploit contextual information, such as a pedestrian's location and direction of motion. Typically, such systems display the current position and the destination on a map on the device's screen. However, this restricts the pedestrian's movements, because the user must hold the device to watch the screen. We have therefore implemented a mobile navigation system that guides the pedestrian unobtrusively by embedding direction information in music. After measuring the directional resolution that the user can perceive, the system shifts the phase of the musical sound to guide the pedestrian. Through user experiments, we verified the effectiveness of the proposed system.
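The abstract does not specify the exact phase-manipulation scheme. A minimal sketch of one plausible approach, under the assumption that direction is encoded as an interaural time delay (a relative phase shift between the stereo channels, which listeners perceive as direction), might look like this; the function names, the Woodworth-style ITD formula, and all constants here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

SAMPLE_RATE = 44100          # samples per second (assumed)
HEAD_RADIUS = 0.0875         # approximate head radius in metres
SPEED_OF_SOUND = 343.0       # metres per second

def itd_seconds(bearing_deg):
    """Interaural time difference (simple spherical-head model) for a
    source at `bearing_deg` degrees to the right of straight ahead."""
    theta = np.radians(bearing_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))

def pan_by_phase(mono, bearing_deg):
    """Return a 2 x N stereo signal in which the target direction is
    encoded by delaying the far ear's channel (a phase shift)."""
    delay = int(round(abs(itd_seconds(bearing_deg)) * SAMPLE_RATE))
    delayed = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    if bearing_deg >= 0:                  # source on the right: delay left ear
        return np.stack([delayed, mono])
    return np.stack([mono, delayed])      # source on the left: delay right ear

# Example: one second of a 440 Hz tone placed 30 degrees to the right.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t)
stereo = pan_by_phase(tone, 30)
```

In a navigation loop, `bearing_deg` would be recomputed from the pedestrian's position and heading, and the delay applied to the currently playing music rather than a test tone.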