Wide-area scene mapping for mobile visual tracking

  • Authors:
  • Jonathan Ventura; Tobias Höllerer

  • Affiliations:
  • University of California, Santa Barbara, USA; University of California, Santa Barbara, USA

  • Venue:
  • ISMAR '12 Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
  • Year:
  • 2012

Abstract

We propose a system for easily preparing arbitrary wide-area environments for subsequent real-time tracking with a handheld device. Panoramas captured with a handheld omnidirectional camera from several viewpoints are combined to create a point cloud model. After this offline modeling step, live camera pose tracking is initialized by feature point matching and continuously updated by aligning the point cloud model to the camera image. Our evaluation shows that minimal user effort is required to initialize a tracking session in an unprepared environment: given a reconstruction made from less than five minutes of video, we achieve below 25 cm translational error and 0.5 degrees rotational error for over 80% of the images tested. In contrast to camera-based simultaneous localization and mapping (SLAM) systems, our method is suitable for handheld use in large outdoor spaces.
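The accuracy figures quoted above (translational error in metric units, rotational error in degrees) are standard pose-error metrics. As a hedged illustration, not the authors' evaluation code, they can be computed from an estimated and a ground-truth camera pose roughly as follows (the function name and interface are our own):

```python
import numpy as np

def pose_errors(R_est, t_est, R_gt, t_gt):
    """Translational error (same units as t) and rotational error (degrees)
    between an estimated camera pose (R_est, t_est) and ground truth."""
    # Translational error: Euclidean distance between camera positions.
    t_err = np.linalg.norm(np.asarray(t_est) - np.asarray(t_gt))
    # Rotational error: angle of the relative rotation R_est^T * R_gt,
    # recovered from its trace via the axis-angle formula.
    R_rel = np.asarray(R_est).T @ np.asarray(R_gt)
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    r_err_deg = np.degrees(np.arccos(cos_angle))
    return t_err, r_err_deg
```

Under these metrics, an image "passes" the paper's threshold when `t_err < 0.25` (meters) and `r_err_deg < 0.5`.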