Construction of Virtual Environment from Video Data with Forward Motion

  • Authors:
  • Xiahua Zhang; Hiroki Takahashi; Masayuki Nakajima

  • Venue:
  • AMCP '98 Proceedings of the First International Conference on Advanced Multimedia Content Processing
  • Year:
  • 1998

Abstract

The construction of photo-realistic 3D scenes from video data is an active and competitive research area spanning computer vision, image processing, and computer graphics. In this paper we describe our recent work in this area. Unlike most methods of 3D scene construction, we consider the generation of virtual environments from video sequences captured by a camera moving forward. Each frame is decomposed into sub-images, and corresponding sub-images are registered using the Levenberg-Marquardt iterative algorithm to estimate the motion parameters. The registered sub-images are then pasted together to form a pseudo-3D space. By controlling its position and viewing direction, a virtual camera can walk through this virtual space and render novel 2D views, giving an immersive impression. Even when the virtual camera moves deep into the environment, it still obtains novel views at high resolution.
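
The sketch below is not the authors' implementation; it only illustrates the kind of Levenberg-Marquardt registration the abstract describes, under the simplifying assumption that forward motion warps a sub-image by a scale-plus-translation model (scale s, shifts tx, ty), solved here with SciPy's LM optimizer.

```python
# Minimal sketch: estimate motion parameters (s, tx, ty) between two
# sub-images by Levenberg-Marquardt minimization of intensity residuals.
# The similarity-warp model is an assumption made for this illustration.
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import least_squares


def warp(image, params):
    """Resample `image` under a scale-plus-translation warp about its center."""
    s, tx, ty = params
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Source coordinates that map onto each target pixel.
    src_x = (xs - w / 2) / s + w / 2 - tx
    src_y = (ys - h / 2) / s + h / 2 - ty
    return map_coordinates(image, [src_y, src_x], order=1, mode='nearest')


def residuals(params, ref, cur):
    """Per-pixel difference between the warped reference and the current sub-image."""
    return (warp(ref, params) - cur).ravel()


def register(ref, cur, init=(1.0, 0.0, 0.0)):
    """Estimate (s, tx, ty) with the Levenberg-Marquardt solver."""
    result = least_squares(residuals, init, args=(ref, cur), method='lm')
    return result.x


if __name__ == "__main__":
    # Smooth synthetic sub-image; forward motion makes the scene expand,
    # so simulate a 5% zoom plus a small drift and recover the parameters.
    ys, xs = np.mgrid[0:64, 0:64]
    ref = np.sin(xs / 6.0) + np.cos(ys / 9.0)
    cur = warp(ref, (1.05, 2.0, -1.0))
    print(register(ref, cur))  # expected to be close to (1.05, 2.0, -1.0)
```

Registered sub-images estimated this way could then be pasted along the motion direction to build the pseudo-3D space the abstract mentions; that assembly step is not shown here.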