Multimodal 'eyes-free' interaction techniques for wearable devices

  • Authors:
  • Stephen Brewster; Joanna Lumsden; Marek Bell; Malcolm Hall; Stuart Tasker

  • Affiliations:
  • University of Glasgow, U.K.; National Research Council of Canada, Fredericton, New Brunswick, Canada; University of Glasgow, U.K.; University of Glasgow, U.K.; University of Glasgow, U.K.

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2003

Abstract

Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time, perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
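
To make the first technique concrete, the sketch below shows one way a head-gesture radial pie menu could map head yaw to a menu segment and re-pan each item's audio cue as the head turns. This is a minimal illustration, not the authors' implementation: the menu items, the four-item layout, and the simple stereo pan (standing in for the paper's spatialised 3D audio) are all assumptions.

```python
# Minimal sketch of head-gesture selection on an audio radial pie menu.
# Hypothetical details: the item names, the 4-item menu, and the stereo
# pan stand-in for true 3D spatialised audio are illustrative only.
import math

MENU_ITEMS = ["play", "stop", "next", "previous"]  # hypothetical items

def item_under_head(yaw_deg: float, n_items: int) -> int:
    """Map head yaw (0 deg = straight ahead, clockwise positive) to the
    index of the pie segment the user is currently facing."""
    segment = 360.0 / n_items
    # Centre the first segment on 0 deg so 'straight ahead' selects item 0.
    return int(((yaw_deg + segment / 2) % 360.0) // segment)

def cue_pan(yaw_deg: float, item_index: int, n_items: int) -> float:
    """Stereo pan (-1 left .. +1 right) for an item's audio cue,
    recomputed from the current head orientation so a cue appears to
    stay put as the head turns toward it."""
    item_angle = item_index * (360.0 / n_items)
    relative = math.radians(item_angle - yaw_deg)
    return math.sin(relative)

if __name__ == "__main__":
    for yaw in (0, 95, 180, -85):
        idx = item_under_head(yaw, len(MENU_ITEMS))
        pan = cue_pan(yaw, idx, len(MENU_ITEMS))
        print(f"yaw {yaw:>4} deg -> '{MENU_ITEMS[idx]}' (pan {pan:+.2f})")
```

A real system would additionally need a confirmation gesture (e.g. a nod) and debouncing of the head-tracker signal before committing a selection.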
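
For the second technique, the sketch below illustrates the general idea of dynamically guiding a 2D gesture with sound: as the pen drifts from an ideal stroke, a feedback tone shifts in pitch, cueing the user to correct mid-gesture. The straight-vertical-stroke template, the base frequency, and the pitch-per-pixel mapping are all assumptions; the paper's actual sonification design may differ.

```python
# Minimal sketch of dynamic audio feedback during 2D gesture entry.
# Hypothetical details: the vertical-stroke template, BASE_HZ, and
# HZ_PER_PIXEL are illustrative values, not the paper's parameters.

BASE_HZ = 440.0      # tone heard while the gesture stays on the path
HZ_PER_PIXEL = 8.0   # pitch shift per pixel of deviation from the path

def deviation_from_vertical(x: float, anchor_x: float) -> float:
    """Perpendicular distance of the pen from an ideal vertical stroke
    anchored at x = anchor_x."""
    return abs(x - anchor_x)

def feedback_hz(deviation_px: float) -> float:
    """Map deviation to a feedback tone: on-path samples hold the base
    pitch; drifting raises it, prompting a mid-gesture correction."""
    return BASE_HZ + HZ_PER_PIXEL * deviation_px

if __name__ == "__main__":
    anchor_x = 100.0
    stroke = [(100, 0), (103, 20), (108, 40), (102, 60)]  # sampled pen points
    for x, y in stroke:
        d = deviation_from_vertical(x, anchor_x)
        print(f"point ({x:>3},{y:>2}): deviation {d:4.1f}px "
              f"-> {feedback_hz(d):6.1f} Hz")
```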