Integrating the physical environment into mobile remote collaboration

  • Authors:
  • Steffen Gauglitz, Cha Lee, Matthew Turk, Tobias Höllerer

  • Affiliations:
  • University of California, Santa Barbara, Santa Barbara, California, United States (all authors)

  • Venue:
  • MobileHCI '12: Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services
  • Year:
  • 2012


Abstract

We describe a framework and prototype implementation for unobtrusive mobile remote collaboration on tasks that involve the physical environment. Our system uses the augmented reality paradigm and model-free, markerless visual tracking to provide decoupled, live-updated views of the environment and world-stabilized annotations, while supporting a moving camera and unknown, unprepared environments. To evaluate our concept and prototype, we conducted a user study with 48 participants in which a remote expert instructed a local user to operate a mock-up airplane cockpit. Users performed significantly better with our prototype (40.8 tasks completed on average) and with static annotations (37.3) than without annotations (28.9). Overall, 79% of the users preferred our prototype despite noticeably imperfect tracking.
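The abstract's "world-stabilized annotations" idea can be illustrated with a minimal sketch (my own, not code from the paper): an annotation anchored at a fixed 3D point is re-projected into each new frame using the tracked camera pose and the pinhole camera model, so it appears attached to the physical object as the camera moves. The function name, pose convention (world-to-camera `R`, `t`), and intrinsics `K` below are illustrative assumptions.

```python
import numpy as np

def project_annotation(point_world, R, t, K):
    """Project a world-anchored 3D annotation into pixel coordinates.

    R, t: world-to-camera rotation and translation (the tracked pose).
    K: 3x3 camera intrinsics matrix.
    Returns (x, y) pixel coordinates, or None if the point is behind the camera.
    """
    p_cam = R @ point_world + t        # transform into camera coordinates
    if p_cam[2] <= 0:                  # behind the camera: not visible
        return None
    p_img = K @ (p_cam / p_cam[2])     # perspective divide, then intrinsics
    return p_img[:2]

# Example: camera at the origin looking down +Z, annotation anchored 2 m ahead.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
anchor = np.array([0.0, 0.0, 2.0])
print(project_annotation(anchor, np.eye(3), np.zeros(3), K))
```

As the tracker updates `R` and `t` each frame, re-running the projection keeps the annotation pinned to the same physical point, which is what distinguishes world-stabilized annotations from static (screen-fixed) ones in the study.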