Using time-of-flight range data for occlusion handling in augmented reality

  • Authors:
  • Jan Fischer; Benjamin Huhle; Andreas Schilling

  • Affiliations:
  • Island Graphics Group, University of Victoria, Canada; WSI/GRIS, University of Tübingen, Germany; WSI/GRIS, University of Tübingen, Germany and Stanford University

  • Venue:
  • EGVE'07: Proceedings of the 13th Eurographics Conference on Virtual Environments
  • Year:
  • 2007

Abstract

One of the main problems of monoscopic video see-through augmented reality (AR) is the lack of reliable depth information. This makes it difficult to correctly represent complex spatial interactions between real and virtual objects, e.g., when rendering shadows. The most obvious graphical artifact is the incorrect display of the occlusion of virtual models by real objects. Since the graphical models are rendered opaquely over the camera image, they always appear to occlude all objects in the real environment, regardless of the actual spatial relationship. In this paper, we propose to utilize a new type of hardware in order to solve some of the basic challenges of AR rendering. We introduce a time-of-flight range sensor into AR, which produces a 2D map of the distances to real objects in the environment. The distance map is registered with high-resolution color images delivered by a digital video camera. When displaying the virtual models in AR, the distance map is used to decide whether the camera image or the virtual object is visible at each position. This way, the occlusion of virtual models by real objects can be represented correctly. Preliminary results obtained with our approach show that useful occlusion handling based on time-of-flight range data is possible.