A Map-Based System Using Speech and 3D Gestures for Pervasive Computing

  • Authors:
  • Andrea Corradini; Richard M. Wesson; Philip R. Cohen

  • Affiliations:
  • Oregon Health & Science University (all authors)

  • Venue:
  • ICMI '02 Proceedings of the 4th IEEE International Conference on Multimodal Interfaces
  • Year:
  • 2002

Abstract

We describe an augmentation of QuickSet, a multimodal voice/pen system that allows users to create and control map-based, collaborative, interactive simulations. In this paper, we report on our extension of the graphical pen input mode from stylus/mouse to 3D hand movements. To do this, the map is projected onto a virtual plane in space, specified by the operator before the start of the interactive session. We then use our geometric model to compute the intersection of hand movements with the virtual plane, translating these into map coordinates on the appropriate system. The goal of this research is the creation of a body-centered, multimodal architecture employing both speech and 3D hand gestures, which seamlessly and unobtrusively supports distributed interaction. The augmented system, built on top of an existing architecture, also provides improved visualization, management, and awareness of a shared understanding. Potential applications of this work include tele-medicine, battlefield management, and any kind of collaborative decision-making during which users may wish to be mobile.
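The abstract does not spell out the geometric model, but the core step it describes, intersecting a 3D hand movement with an operator-specified virtual plane and converting the hit point to 2D map coordinates, has a standard formulation. The Python sketch below illustrates one plausible version under assumed conventions: the tracked hand is treated as the origin of a pointing ray, the plane is given by a point and a normal, and the intersection is expressed in the plane's own orthonormal (u, v) basis. All names and numeric values here are hypothetical, not taken from the paper.

```python
import numpy as np

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the 3D point where a pointing ray meets the virtual map plane,
    or None if the ray is parallel to the plane or points away from it."""
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:   # ray parallel to the plane: no intersection
        return None
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    if t < 0:               # intersection lies behind the hand
        return None
    return ray_origin + t * ray_dir

def to_map_coords(point, plane_origin, u_axis, v_axis):
    """Express an on-plane 3D point in the plane's 2D (u, v) basis,
    yielding coordinates that can be scaled into map coordinates."""
    d = point - plane_origin
    return np.dot(d, u_axis), np.dot(d, v_axis)

# Hypothetical setup: a virtual plane one meter in front of the operator,
# spanned by orthonormal axes u (right) and v (up).
plane_origin = np.array([0.0, 0.0, 1.0])
plane_normal = np.array([0.0, 0.0, -1.0])
u_axis = np.array([1.0, 0.0, 0.0])
v_axis = np.array([0.0, 1.0, 0.0])

hand_pos = np.array([0.1, 0.2, 0.0])       # tracked hand position
point_dir = np.array([0.0, 0.0, 1.0])      # pointing direction

hit = ray_plane_intersection(hand_pos, point_dir, plane_origin, plane_normal)
if hit is not None:
    print(to_map_coords(hit, plane_origin, u_axis, v_axis))  # -> (0.1, 0.2)
```

In a setup like the one described, each frame of hand-tracking data would be run through this mapping, and the resulting 2D coordinates fed to the pen-input pipeline in place of stylus or mouse events, so the rest of the system need not know the input originated as a 3D gesture.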