Fast annotation and modeling with a single-point laser range finder

  • Authors:
  • Jason Wither; Chris Coffin; Jonathan Ventura; Tobias Höllerer

  • Affiliations:
  • Department of Computer Science, University of California, Santa Barbara, USA (all authors)

  • Venue:
  • ISMAR '08 Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2008

Abstract

This paper presents a methodology for integrating a small, single-point laser range finder into a wearable augmented reality system. We first present a way of creating object-aligned annotations with very little user effort. Second, we describe techniques to segment and pop up foreground objects. Finally, we introduce a method that uses the laser range finder to incrementally build 3D panoramas from a fixed observer’s location. To build a 3D panorama semi-automatically, we track the system’s orientation and use the sparse range data acquired as the user looks around, in conjunction with real-time image processing, to construct geometry around the user’s position. Using full 3D panoramic geometry, new virtual objects can be placed in the scene with proper lighting and occlusion by real-world objects, which increases the expressivity of the AR experience.
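
The core building block of the panorama construction described above is lifting each single-point range reading, together with the tracked head orientation, into a 3D point around the fixed observer. The sketch below illustrates that step only; it is not the authors' implementation, and names such as `RangeSample` and `range_to_point` are illustrative assumptions.

```python
# Minimal sketch: convert one (orientation, distance) sample from a
# single-point laser range finder into a Cartesian point in a frame
# centred on the observer. Accumulating such samples as the user looks
# around yields the sparse depth data from which panorama geometry is
# built (the image-processing interpolation step is not shown).

import math
from dataclasses import dataclass


@dataclass
class RangeSample:
    yaw: float       # heading in radians, from the orientation tracker
    pitch: float     # elevation in radians, from the orientation tracker
    distance: float  # metres, from the laser range finder


def range_to_point(sample: RangeSample) -> tuple[float, float, float]:
    """Spherical-to-Cartesian conversion of a single range reading."""
    cos_p = math.cos(sample.pitch)
    x = sample.distance * cos_p * math.sin(sample.yaw)
    y = sample.distance * math.sin(sample.pitch)
    z = sample.distance * cos_p * math.cos(sample.yaw)
    return (x, y, z)


# Each new reading adds one sparse depth sample to the panorama.
panorama_points = [
    range_to_point(s)
    for s in (
        RangeSample(yaw=0.0, pitch=0.0, distance=4.2),
        RangeSample(yaw=math.radians(30), pitch=math.radians(5), distance=6.8),
    )
]
```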