Mobile Reality: A PDA-Based Multimodal Framework Synchronizing a Hybrid Tracking Solution with 3D Graphics and Location-Sensitive Speech Interaction

  • Authors:
  • Stuart Goose, Heiko Wanning, Georg Schneider

  • Venue:
  • UbiComp '02: Proceedings of the 4th International Conference on Ubiquitous Computing
  • Year:
  • 2002

Abstract

A maintenance engineer who talks to pumps and pipes may not seem like the ideal person to entrust with keeping a factory running smoothly, but we hope that the Mobile Reality framework will one day make such behavior anything but suspicious. This paper describes how the Mobile Reality framework, running entirely on a Pocket PC, synchronizes a hybrid tracking solution to offer the user a seamless, location-dependent, mobile multimodal interface. The user interface juxtaposes a three-dimensional graphical view with a context-sensitive speech dialog centered on objects located in the immediate vicinity of the mobile user. In addition, collaboration support enables shared VRML browsing with annotation and a full-duplex voice channel.