Menu selection using auditory interface

  • Authors:
  • Koichi Hirota; Yosuke Watanabe; Yasushi Ikei

  • Affiliations:
  • Graduate School of Frontier Sciences, University of Tokyo, Kashiwa, Chiba; Faculty of System Design, Tokyo Metropolitan University, Hino, Tokyo; Faculty of System Design, Tokyo Metropolitan University, Hino, Tokyo

  • Venue:
  • HCI'07: Proceedings of the 12th International Conference on Human-Computer Interaction: Intelligent Multimodal Interaction Environments
  • Year:
  • 2007


Abstract

An approach to auditory interaction with a wearable computer is investigated. Menu selection and keyboard input interfaces are experimentally implemented by integrating a pointing interface based on motion sensors with an auditory localization system based on HRTFs. User performance, i.e., the efficiency of interaction, is evaluated through experiments with human subjects. The average time for selecting a menu item was approximately 5-9 seconds, depending on the geometric configuration of the menu, and the average key input rate was approximately 6 seconds per character. The results did not support our expectation that auditory localization of menu items would be a helpful cue for accurate pointing.
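As a rough illustration of the selection mechanism the abstract describes, the sketch below places menu items at fixed azimuth angles around the listener and confirms a selection when the pointing direction, as reported by a motion sensor, dwells on an item. The angular spacing, pointing tolerance, dwell time, and all function names are assumptions for illustration only, not the authors' exact design; the actual system additionally renders each item as a spatialized sound via HRTF filtering, which is omitted here.

```python
# Hypothetical sketch of dwell-based selection in an auditory menu.
# Items sit at assumed azimuth angles around the listener; the user
# points (e.g., by head or hand orientation) and holds to select.

MENU_ITEMS = ["Open", "Save", "Close", "Quit"]
SPACING_DEG = 30.0    # assumed angular spacing between items
TOLERANCE_DEG = 10.0  # assumed pointing tolerance
DWELL_S = 1.0         # assumed dwell time to confirm a selection


def item_azimuth(index: int) -> float:
    """Azimuth (degrees) of the index-th item, centered in front of the user."""
    offset = (len(MENU_ITEMS) - 1) / 2.0
    return (index - offset) * SPACING_DEG


def pointed_item(azimuth_deg: float) -> int | None:
    """Return the index of the item the user is pointing at, if any."""
    for i in range(len(MENU_ITEMS)):
        if abs(azimuth_deg - item_azimuth(i)) <= TOLERANCE_DEG:
            return i
    return None


def select(samples: list[tuple[float, float]]) -> str | None:
    """Scan (time_s, azimuth_deg) sensor samples and return the first
    item held continuously within tolerance for DWELL_S seconds."""
    current, since = None, 0.0
    for t, az in samples:
        hit = pointed_item(az)
        if hit != current:
            current, since = hit, t  # pointing target changed: restart dwell
        elif hit is not None and t - since >= DWELL_S:
            return MENU_ITEMS[hit]
    return None


if __name__ == "__main__":
    # Simulated sensor trace: user sweeps rightward, then dwells on "Save".
    trace = [(0.1 * k, -40.0 + 3.0 * k) for k in range(20)]
    trace += [(2.0 + 0.1 * k, item_azimuth(1)) for k in range(15)]
    print(select(trace))  # -> "Save"
```

Under these assumptions, wider item spacing makes pointing more forgiving but lengthens the sweep between items, which is one plausible reason the reported selection times varied with the geometric configuration of the menu.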