Hybrid Inertial and Vision Tracking for Augmented Reality Registration

  • Authors:
  • Suya You, Ulrich Neumann, Ronald Azuma


  • Venue:
  • VR '99: Proceedings of the IEEE Virtual Reality
  • Year:
  • 1999


Abstract

The biggest single obstacle to building effective augmented reality (AR) systems is the lack of accurate, wide-area sensors for trackers that report the locations and orientations of objects in an environment. Active (sensor-emitter) tracking technologies require powered-device installation, limiting their use to prepared areas that are relatively free of natural or man-made interference sources. Vision-based systems can use passive landmarks, but they are more computationally demanding and often exhibit erroneous behavior due to occlusion or numerical instability. Inertial sensors are completely passive, requiring no external devices or targets; however, the drift rates in portable strapdown configurations are too great for practical use. In this paper, we present a hybrid approach to AR tracking that integrates inertial and vision-based technologies. We exploit the complementary nature of the two technologies to compensate for the weaknesses in each component. Analysis and experimental results demonstrate this system's effectiveness.
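The complementary fusion the abstract describes can be illustrated with a minimal sketch. The paper's actual filter design is not given here, so the following is a generic one-dimensional complementary filter, not the authors' implementation: the gyro prediction supplies smooth, high-rate updates (but drifts), while an occasional vision measurement anchors the estimate (but may drop out under occlusion). The function name, blend factor `alpha`, and all signal values are illustrative assumptions.

```python
def complementary_filter(angle, gyro_rate, vision_angle, dt, alpha=0.98):
    """Illustrative 1-D inertial/vision fusion (not the paper's filter).

    angle        -- previous fused orientation estimate (rad)
    gyro_rate    -- current inertial angular rate (rad/s), may be biased
    vision_angle -- drift-free vision measurement (rad), or None on dropout
    dt           -- time step (s)
    alpha        -- blend factor: trust in the inertial prediction
    """
    # Inertial prediction: smooth and fast, but integrates gyro bias (drift).
    predicted = angle + gyro_rate * dt
    if vision_angle is None:
        # Vision occluded or unstable: coast on the inertial prediction alone.
        return predicted
    # Vision correction: slowly pull the estimate toward the drift-free
    # measurement, bounding the accumulated inertial drift.
    return alpha * predicted + (1 - alpha) * vision_angle
```

With a constant gyro bias and a stationary target, pure integration drifts without bound, while the fused estimate settles at a small bounded offset, which is the complementary behavior the hybrid design exploits.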