Novel stereoscopic view generation by image-based rendering coordinated with depth information

  • Authors:
  • Maiya Hori; Masayuki Kanbara; Naokazu Yokoya

  • Affiliations:
  • Graduate School of Information Science, Nara Institute of Science and Technology, Ikoma, Nara, Japan (all authors)

  • Venue:
  • SCIA'07: Proceedings of the 15th Scandinavian Conference on Image Analysis
  • Year:
  • 2007

Abstract

This paper describes a method of stereoscopic view generation by image-based rendering in wide outdoor environments. The stereoscopic view is generated from an omnidirectional image sequence using a light field rendering approach, which synthesizes a novel view image from a set of captured images. Conventional methods of novel view generation suffer from distortion because the generated image is composed of parts of several omnidirectional images captured at different points. To overcome this problem, the distances between the novel viewpoint and the observed real objects must be taken into account in the rendering process. In the proposed method, to reduce this distortion, stereoscopic images are generated using depth values estimated by dynamic programming (DP) matching between images that are observed from different points but contain the same ray information in the real world. In experiments, stereoscopic images of wide outdoor environments are generated and displayed.
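
The depth-estimation step mentioned in the abstract can be made concrete with a small sketch. The following Python (NumPy) code is a minimal, illustrative dynamic-programming scanline matcher, not the authors' implementation: the function name `dp_scanline_match`, the occlusion cost, and the use of 1-D intensity scanlines are assumptions standing in for matching rays observed from different omnidirectional capture points; in the paper's setting the recovered disparity would be converted to depth from the known baseline between viewpoints.

```python
import numpy as np

def dp_scanline_match(left, right, occlusion_cost=5.0):
    """Align two 1-D intensity scanlines with dynamic programming and
    return a per-pixel disparity estimate for the left scanline.

    Illustrative stand-in for DP matching of rays seen from two
    different capture points (hypothetical, not the paper's code).
    """
    n = len(left)
    # cost[i, j]: best cost of aligning left[:i] with right[:j].
    cost = np.full((n + 1, n + 1), np.inf)
    cost[0, :] = occlusion_cost * np.arange(n + 1)
    cost[:, 0] = occlusion_cost * np.arange(n + 1)
    # back[i, j]: 0 = match, 1 = skip left pixel, 2 = skip right pixel.
    back = np.zeros((n + 1, n + 1), dtype=np.uint8)

    for i in range(1, n + 1):
        for j in range(1, n + 1):
            d = abs(float(left[i - 1]) - float(right[j - 1]))
            choices = (cost[i - 1, j - 1] + d,          # match
                       cost[i - 1, j] + occlusion_cost,  # occluded in right
                       cost[i, j - 1] + occlusion_cost)  # occluded in left
            k = int(np.argmin(choices))
            cost[i, j] = choices[k]
            back[i, j] = k

    # Backtrack to recover disparity (left index minus matched right index).
    disparity = np.zeros(n)
    i, j = n, n
    while i > 0 and j > 0:
        if back[i, j] == 0:
            disparity[i - 1] = i - j
            i, j = i - 1, j - 1
        elif back[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return disparity

# Example: the right scanline is the left one shifted by 3 pixels,
# so the recovered disparity should be about 3 in the interior.
x = np.linspace(0, 4 * np.pi, 200)
left = np.sin(x)
right = np.roll(left, -3)
print(dp_scanline_match(left, right)[50:60])
```

Given a disparity d for a matched ray pair and the baseline b between the two capture points, a pinhole-style approximation such as depth ≈ f·b/d (with focal length f) would yield the depth values used to select and blend rays; the exact geometry in the paper depends on the omnidirectional camera model and is not reproduced here.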