Interactive image-based rendering using feature globalization

  • Authors:
  • Daniel G. Aliaga; Dimah Yanovsky; Thomas Funkhouser; Ingrid Carlbom

  • Affiliations:
  • Lucent Bell Labs; Harvard University; Princeton University; Lucent Bell Labs

  • Venue:
  • I3D '03: Proceedings of the 2003 Symposium on Interactive 3D Graphics
  • Year:
  • 2003

Abstract

Image-based rendering (IBR) systems enable virtual walkthroughs of photorealistic environments by warping and combining reference images to novel viewpoints under interactive user control. A significant challenge in such systems is to automatically compute image correspondences that enable accurate image warping.

In this paper, we describe a new algorithm for computing a globally consistent set of image feature correspondences across a wide range of viewpoints suitable for IBR walkthroughs. We first detect point features in a dense set of omnidirectional images captured on an eye-height plane. Then, we track these features from image to image, identifying potential correspondences when two features track to the same position in the same image. Among the potential correspondences, we select the maximal consistent set using a greedy graph-labeling algorithm.

A key feature of our approach is that it exploits the multiple paths that can be followed between images in order to increase the number of feature correspondences between distant images. We demonstrate the benefits of this approach in a real-time IBR walkthrough system where novel images are reconstructed as the user moves interactively.
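To make the globalization step in the abstract concrete, the sketch below shows one plausible reading of it: potential correspondences between tracked feature observations are accepted greedily, strongest first, and a merge is rejected when it would place two different observations from the same image into one global feature. This is not the authors' implementation; the `GlobalFeatures` union-find helper, the `(image_id, feature_id)` observation layout, and the scoring of candidates are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's code) of greedily merging tracked feature
# observations into globally consistent correspondence sets.
# Assumption: a "potential correspondence" links two observations whose
# tracks met at the same position in some image; consistency means a global
# feature never contains two observations from the same image.

from typing import Dict, List, Set, Tuple

Obs = Tuple[int, int]  # hypothetical observation key: (image_id, feature_id)


class GlobalFeatures:
    """Union-find over observations, tracking the image set of each group."""

    def __init__(self) -> None:
        self.parent: Dict[Obs, Obs] = {}
        self.images: Dict[Obs, Set[int]] = {}

    def find(self, o: Obs) -> Obs:
        self.parent.setdefault(o, o)
        self.images.setdefault(o, {o[0]})
        while self.parent[o] != o:
            self.parent[o] = self.parent[self.parent[o]]  # path halving
            o = self.parent[o]
        return o

    def try_merge(self, a: Obs, b: Obs) -> bool:
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return True
        # Reject a merge that would label two features from the same image
        # as the same global feature (the consistency constraint).
        if self.images[ra] & self.images[rb]:
            return False
        self.parent[rb] = ra
        self.images[ra] |= self.images.pop(rb)
        return True


def globalize(candidates: List[Tuple[Obs, Obs, float]]) -> GlobalFeatures:
    """Greedily accept potential correspondences, strongest first."""
    gf = GlobalFeatures()
    for a, b, _score in sorted(candidates, key=lambda t: -t[2]):
        gf.try_merge(a, b)
    return gf


if __name__ == "__main__":
    # Two candidates chain feature (0, 5) to (2, 3); a weaker third candidate
    # would conflict within image 2 and is therefore rejected.
    cands = [((0, 5), (1, 7), 0.9), ((1, 7), (2, 3), 0.8), ((0, 5), (2, 4), 0.4)]
    gf = globalize(cands)
    print(gf.find((0, 5)) == gf.find((2, 3)))  # True: linked via image 1
    print(gf.find((0, 5)) == gf.find((2, 4)))  # False: merge was rejected
```

The chaining in the example also hints at the "multiple paths" property mentioned in the abstract: distant images become linked through intermediate images even when no direct candidate correspondence exists between them.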