Mapping GUIs to auditory interfaces
UIST '92 Proceedings of the 5th annual ACM symposium on User interface software and technology
A system for three-dimensional acoustic "visualization" in a virtual environment workstation
VIS '90 Proceedings of the 1st conference on Visualization '90
vCocktail: Multiplexed-voice Menu Presentation Method for Wearable Computers
VR '06 Proceedings of the IEEE conference on Virtual Reality
An approach to auditory interaction with wearable computers is investigated. Menu-selection and keyboard-input interfaces are experimentally implemented by integrating a pointing interface based on motion sensors with an auditory localization system based on HRTFs. User performance, i.e. the efficiency of interaction, is evaluated in experiments with human subjects. The average time to select a menu item was approximately 5-9 seconds depending on the geometric configuration of the menu, and average key-input performance was approximately 6 seconds per character. The results did not support our expectation that auditory localization of menu items would be a helpful cue for accurate pointing.
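The selection scheme the abstract describes, pointing at spatially placed menu items, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the azimuth layout, the selection tolerance, and both function names are assumptions made here for clarity.

```python
def menu_azimuths(n_items, span_deg=180.0):
    """Evenly space n_items across a frontal azimuth span (degrees),
    centred on 0 degrees (straight ahead). Hypothetical layout; the
    paper's actual menu geometry may differ."""
    if n_items == 1:
        return [0.0]
    step = span_deg / (n_items - 1)
    return [-span_deg / 2 + i * step for i in range(n_items)]

def select_item(pointing_az, azimuths, tolerance_deg=15.0):
    """Return the index of the menu item closest to the user's
    pointing azimuth, or None if nothing lies within tolerance.
    The tolerance value is an assumed parameter."""
    best = min(range(len(azimuths)),
               key=lambda i: abs(azimuths[i] - pointing_az))
    if abs(azimuths[best] - pointing_az) <= tolerance_deg:
        return best
    return None
```

In a real system each azimuth would be rendered through HRTF filtering so the item is heard at that direction, and the pointing azimuth would come from the motion sensors; the logic above only shows the geometric matching step.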