Virtual View Synthesis from Uncalibrated Stereo Cameras

  • Authors:
  • M. A. Akhloufi, V. Polotski, P. Cohen

  • Affiliations:
  • Ecole Polytechnique de Montreal (all authors)

  • Venue:
  • ICMCS '99 Proceedings of the 1999 IEEE International Conference on Multimedia Computing and Systems - Volume 02
  • Year:
  • 1999

Abstract

This paper presents a new approach to synthesizing a novel view from two images captured by an uncalibrated stereo system. View synthesis here exploits the epipolar constraints associated with a two-camera configuration: the fundamental matrix is used to obtain features in the synthesized view by reprojecting corresponding features from the source images. Unlike classical methods, which infer the three-dimensional structure of the scene or rely on dense correspondence between the source images to produce the synthesized image, this method requires only a sparse correspondence between source-image features. Perspective image-warping techniques then render the remaining image points via interpolation. The approach permits interactive view synthesis in immersive telepresence systems, realistic virtual worlds, and augmented-reality displays that overlay objects at different positions on live video of dynamic scenes. The method's efficiency is illustrated with examples of synthetic and real scenes.
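The core idea the abstract describes, reprojecting a corresponding feature pair into the synthesized view via epipolar constraints, can be illustrated with a standard epipolar-transfer sketch. This is not the paper's exact algorithm; it is a minimal numpy illustration under assumed synthetic cameras (`P1`, `P2`, `P3` and the helper names are this sketch's own), where the virtual-view point is recovered as the intersection of the two epipolar lines induced by the source points:

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def fundamental_from_projections(P_a, P_b):
    """F mapping a point x_a in view a to its epipolar line F @ x_a in view b,
    built as F = [e_b]_x P_b P_a^+ from known 3x4 projection matrices."""
    C_a = np.linalg.svd(P_a)[2][-1]    # camera centre of view a (null vector of P_a)
    e_b = P_b @ C_a                    # epipole: image of that centre in view b
    return skew(e_b) @ P_b @ np.linalg.pinv(P_a)

def transfer_point(F13, F23, x1, x2):
    """Epipolar transfer: the synthesized-view point lies on both epipolar
    lines induced by the corresponding source points x1 and x2."""
    l1 = F13 @ x1                      # epipolar line of x1 in the virtual view
    l2 = F23 @ x2                      # epipolar line of x2 in the virtual view
    x3 = np.cross(l1, l2)              # homogeneous intersection of the two lines
    return x3 / x3[2]

# Synthetic check: three known cameras observing one 3D point
# (chosen so the point does not lie on the trifocal plane, where transfer degenerates).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])        # reference camera
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])  # second source camera
P3 = np.hstack([np.eye(3), [[0.0], [-1.0], [0.0]]])  # "virtual" camera

X = np.array([0.2, 0.3, 5.0, 1.0])                   # a 3D scene point
x1, x2, x3_true = (P @ X for P in (P1, P2, P3))
x1, x2, x3_true = x1 / x1[2], x2 / x2[2], x3_true / x3_true[2]

F13 = fundamental_from_projections(P1, P3)
F23 = fundamental_from_projections(P2, P3)
x3_est = transfer_point(F13, F23, x1, x2)
print(x3_est)   # matches x3_true
```

In the paper's setting the fundamental matrices come from sparse correspondences rather than known projection matrices, and the transferred feature points anchor a warp that interpolates the remaining pixels.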