Multimodal Interaction with a Wearable Augmented Reality System

  • Authors:
  • Mathias Koelsch; Ryan Bane; Tobias Hoellerer; Matthew Turk

  • Affiliations:
  • Naval Postgraduate School; Microsoft Corporation; University of California, Santa Barbara; University of California, Santa Barbara

  • Venue:
  • IEEE Computer Graphics and Applications
  • Year:
  • 2006

Abstract

Wearable computers and their novel applications demand more context-specific user interfaces than traditional desktop paradigms can offer. This article describes a multimodal interface, explaining how it enhances a mobile user's situational awareness and provides new functionality. The mobile augmented-reality system visualizes otherwise invisible information encountered in urban environments. A versatile filtering tool allows interactive display of occluded infrastructure and of dense data distributions, such as room temperature or wireless network strength, with applications in building maintenance, emergency response, and reconnaissance missions. To control this complex application functionality in the real world, the authors combine multiple input modalities (vision-based hand gesture recognition, a 1D tool, and speech recognition) with three late-integration styles to provide intuitive and effective input. The system is demonstrated in a realistic indoor and outdoor task environment, and preliminary user experiences are described. The authors postulate that novel interaction metaphors must be developed together with user interfaces that are capable of controlling them.
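
The late-integration styles mentioned in the abstract refer to fusing the outputs of independently running recognizers (speech, hand gestures, the 1D tool) after each has produced its own hypotheses, rather than merging raw signals. The sketch below is only a minimal illustration of one such style under stated assumptions, not the authors' implementation: the ModalityEvent type, the fuse_late function, the max_skew parameter, and the time-window pairing rule are all hypothetical names and simplifications chosen for clarity.

```python
# Minimal sketch of one late-integration style (hypothetical names, not the paper's code):
# each recognizer emits timestamped, scored hypotheses, and a fusion step pairs a spoken
# command with the most confident temporally close gesture or 1D-tool event.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModalityEvent:
    modality: str      # "speech", "gesture", or "tool" (1D input device)
    label: str         # recognized token, e.g. "show temperature" or "point:north-wall"
    confidence: float  # recognizer score in [0, 1]
    timestamp: float   # seconds since session start

def fuse_late(speech: ModalityEvent,
              candidates: list[ModalityEvent],
              max_skew: float = 1.5) -> Optional[tuple[str, str]]:
    """Pair a speech command with the best temporally aligned gesture or tool event."""
    nearby = [e for e in candidates
              if abs(e.timestamp - speech.timestamp) <= max_skew]
    if not nearby:
        return None
    # Prefer the highest-confidence complementary event within the time window.
    best = max(nearby, key=lambda e: e.confidence)
    return speech.label, best.label

# Example: "show temperature" spoken shortly after a pointing gesture; the tool event
# arrives too late to be paired with this utterance.
events = [
    ModalityEvent("gesture", "point:north-wall", 0.82, 12.3),
    ModalityEvent("tool", "filter-radius:+1", 0.95, 15.0),
]
command = fuse_late(ModalityEvent("speech", "show temperature", 0.9, 12.5), events)
print(command)  # ('show temperature', 'point:north-wall')
```

Keeping fusion at this level lets each modality's recognizer be developed and tuned separately, which is the usual motivation for late integration over early (signal-level) fusion.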