Mobile Augmented Reality: Robust detection and tracking of annotations for outdoor augmented reality browsing

  • Authors:
  • Tobias Langlotz;Claus Degendorfer;Alessandro Mulloni;Gerhard Schall;Gerhard Reitmayr;Dieter Schmalstieg

  • Affiliations:
  • Graz University of Technology, Institute for Computer Graphics and Vision, Inffeldgasse 16, 8010 Graz, Austria (all authors)

  • Venue:
  • Computers & Graphics
  • Year:
  • 2011


Abstract

A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points that is suitable for current-generation mobile phones and copes with the wide variety of viewing conditions encountered in real-life outdoor use. The approach is based on the on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking, as long as the user performs only rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which better represents varying illumination conditions. This panorama is then searched for known anchor points while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistically estimating a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created to the current view, represented by a new panorama. Once the anchor points are re-detected, we track the user's movement with a novel 3-degree-of-freedom orientation tracking approach that combines vision-based tracking with the absolute orientation from inertial and magnetic sensors. We tested our system with an AR campus guide as an example application and provide detailed results obtained on an off-the-shelf smartphone. Results show that the re-detection rate improves by a factor of 2 over previous work, reaching almost 90% across a wide variety of test cases while maintaining interactive frame rates.
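The abstract's final tracking step, combining drift-prone vision-based orientation tracking with the absolute orientation from inertial and magnetic sensors, is commonly realized as a complementary filter. The sketch below illustrates that general idea only; the function names, the rotation-matrix representation, and the blend factor `ALPHA` are illustrative assumptions, not the authors' implementation.

```python
# Illustrative complementary filter for 3-DOF orientation fusion.
# Assumption: both inputs are 3x3 rotation matrices in a common world frame;
# ALPHA is a made-up per-frame correction gain, not a value from the paper.
import numpy as np

ALPHA = 0.02  # fraction of the vision-to-sensor residual applied per frame


def axis_angle_to_matrix(axis, angle):
    """Rodrigues' formula: rotation matrix from a unit axis and an angle."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)


def matrix_to_axis_angle(R):
    """Inverse of Rodrigues' formula (non-degenerate case)."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return np.array([1.0, 0.0, 0.0]), 0.0
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    return axis, angle


def fuse(R_vision, R_sensor, alpha=ALPHA):
    """Pull the smooth but drifting vision estimate a small fraction
    alpha toward the noisy but drift-free sensor orientation."""
    R_err = R_sensor @ R_vision.T              # residual rotation vision -> sensor
    axis, angle = matrix_to_axis_angle(R_err)  # residual as axis-angle
    R_corr = axis_angle_to_matrix(axis, alpha * angle)
    return R_corr @ R_vision
```

With a small `alpha`, high-frequency sensor noise is suppressed while long-term drift of the vision tracker is slowly corrected, which matches the complementary roles the abstract assigns to the two modalities.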