Novel View Generation from Multiple Omni-Directional Videos

  • Authors:
  • Tomoya Ishikawa; Kazumasa Yamazawa; Naokazu Yokoya

  • Affiliations:
  • Nara Institute of Science and Technology (all authors)

  • Venue:
  • ISMAR '05 Proceedings of the 4th IEEE/ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2005

Abstract

Recently, the generation of novel views from images acquired by multiple cameras has been investigated in the fields of virtual and mixed reality. Most conventional methods rely on assumptions about the scene, such as a static scene or restricted object positions. In this paper, we propose a new method for generating novel view images of a dynamic scene with a wide field of view that does not depend on such assumptions. The images acquired from omni-directional cameras are first divided into static and dynamic regions. Novel view images are then generated in real time by applying a morphing technique to the static regions and by computing visual hulls for the dynamic regions. In experiments, we show that a prototype system can generate novel view images in real time from live video streams.
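The first step of the pipeline, separating each omni-directional frame into static and dynamic regions, can be sketched with a simple per-pixel background-difference model. This is only an illustrative assumption: the abstract does not specify the segmentation method, and the function name and threshold below are hypothetical.

```python
import numpy as np

def segment_dynamic(frame, background, threshold=30):
    """Return a boolean mask that is True where the current frame
    differs from the background model by more than `threshold`,
    i.e. pixels treated as the dynamic region."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Toy example: a uniform background with one bright "moving object".
background = np.full((8, 8), 100, dtype=np.uint8)
frame = background.copy()
frame[2:4, 2:4] = 200  # 2x2 dynamic patch

mask = segment_dynamic(frame, background)
```

In a real system the static mask would feed the morphing stage and the dynamic mask the visual-hull computation; a per-pixel threshold is merely the simplest stand-in for that segmentation.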