Novel view generation for a real-time captured video object

  • Authors:
  • Hua Chen; Peter F. Elzer

  • Affiliations:
  • Technical University of Clausthal (TUC), Clausthal-Zellerfeld, Germany (both authors)

  • Venue:
  • Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST)
  • Year:
  • 2006

Abstract

This paper describes a method for real-time novel view synthesis for an object observed by two fixed video cameras in a natural environment. Without reconstructing a 3D model of the object, the view corresponding to a virtual camera that moves between the two real cameras is generated by applying a view morphing process to the object region in the image pair captured in real time. Using the captured live video frames, the proposed method not only generates realistic novel views in real time, but also ensures a natural and smooth image transition between the two cameras. It can be used in a variety of Mixed Reality (MR) applications to integrate live video objects into virtual environments. Experimental results verify the validity of the proposed approach.
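
The paper itself does not include code. The sketch below only illustrates the general view-morphing idea the abstract refers to (rectify the two frames, establish dense correspondence, linearly interpolate pixel positions for an intermediate camera at parameter s in [0, 1]); it is not the authors' implementation. It assumes OpenCV and NumPy, uses stereo disparity as the dense correspondence, and the file names are placeholders. The paper additionally segments the object region and handles the postwarp along the virtual camera path, both of which are omitted here.

```python
# Minimal view-morphing sketch (assumed pipeline, not the paper's code):
# prewarp (rectify), dense correspondence via disparity, then linear
# interpolation of pixel positions for a virtual camera at parameter s.
import cv2
import numpy as np


def morph_views(img_left, img_right, s=0.5):
    """Synthesize an intermediate view between two images of the same scene."""
    gray_l = cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY)
    h, w = gray_l.shape

    # 1. Sparse matches -> fundamental matrix -> rectifying homographies (prewarp).
    orb = cv2.ORB_create(2000)
    kp_l, des_l = orb.detectAndCompute(gray_l, None)
    kp_r, des_r = orb.detectAndCompute(gray_r, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_l, des_r)
    pts_l = np.float32([kp_l[m.queryIdx].pt for m in matches])
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in matches])
    F, inliers = cv2.findFundamentalMat(pts_l, pts_r, cv2.FM_RANSAC, 1.0, 0.99)
    inliers = inliers.ravel().astype(bool)
    _, H_l, H_r = cv2.stereoRectifyUncalibrated(pts_l[inliers], pts_r[inliers], F, (w, h))
    rect_l = cv2.warpPerspective(img_left, H_l, (w, h))
    rect_r = cv2.warpPerspective(img_right, H_r, (w, h))

    # 2. Dense correspondence along rectified scanlines (disparity map).
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disp = sgbm.compute(cv2.cvtColor(rect_l, cv2.COLOR_BGR2GRAY),
                        cv2.cvtColor(rect_r, cv2.COLOR_BGR2GRAY)).astype(np.float32) / 16.0

    # 3. Morph: move each left-image pixel a fraction s of the way toward its
    #    corresponding right-image position (x_s = x_l - s * d), forward-splatting
    #    its color. Hole filling and the postwarp to the virtual view are omitted.
    ys, xs = np.mgrid[0:h, 0:w]
    valid = disp > 0
    x_new = np.round(xs - s * disp).astype(int)
    ok = valid & (x_new >= 0) & (x_new < w)
    morphed = np.zeros_like(rect_l)
    morphed[ys[ok], x_new[ok]] = rect_l[ys[ok], xs[ok]]
    return morphed


if __name__ == "__main__":
    left = cv2.imread("cam_left.png")    # placeholder frame from camera 1
    right = cv2.imread("cam_right.png")  # placeholder frame from camera 2
    cv2.imwrite("virtual_view.png", morph_views(left, right, s=0.5))
```

In the paper's setting, restricting this interpolation to the segmented object region is what allows the morphed object to be composited into a virtual environment while the virtual camera slides between the two real cameras.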