3DTV view generation using uncalibrated pure rotating and zooming cameras

  • Authors:
  • Songkran Jarusirisawad; Hideo Saito

  • Affiliations:
  • Department of Information and Computer Science, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522, Japan (both authors)

  • Venue:
  • Signal Processing: Image Communication
  • Year:
  • 2009

Abstract

This paper proposes a novel method for synthesizing free viewpoint video captured by uncalibrated, purely rotating and zooming cameras. Neither the intrinsic nor the extrinsic parameters of our cameras are known. Projective grid space (PGS), a 3D space defined by the epipolar geometry of two basis cameras, is employed for weak camera calibration. Trifocal tensors relate the non-basis cameras to PGS. Given the trifocal tensors in the initial frame, our method automatically computes them for the remaining frames, using the scale invariant feature transform (SIFT) to find corresponding points in the natural scene between the initial frame and the later frames. Finally, free viewpoint video is synthesized from the reconstructed visual hull. Experimental results show that free viewpoint video captured by uncalibrated hand-held cameras is successfully synthesized with the proposed method.
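To make the weak-calibration step concrete, the sketch below shows how the epipolar geometry between the two basis cameras could be recovered from SIFT correspondences alone, without any intrinsic or extrinsic parameters. This is not the authors' implementation; it is a minimal illustration using OpenCV, and the function name `weak_calibrate` and its parameters are hypothetical.

```python
import cv2
import numpy as np

def weak_calibrate(img_a, img_b, ratio=0.75):
    """Estimate the fundamental matrix between two basis views from SIFT
    correspondences (weak calibration; no camera parameters required).
    Illustrative sketch only, not the paper's actual pipeline."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)

    # Match SIFT descriptors and keep matches that pass Lowe's ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_a, des_b, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]

    pts_a = np.float32([kp_a[m.queryIdx].pt for m in good])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in good])

    # RANSAC-robust fundamental matrix: this encodes the epipolar geometry
    # of the two basis cameras, which is what defines the projective grid space.
    F, mask = cv2.findFundamentalMat(pts_a, pts_b, cv2.FM_RANSAC, 1.0, 0.999)
    inliers = mask.ravel() == 1
    return F, pts_a[inliers], pts_b[inliers]
```

In the same spirit, SIFT matches between the initial frame and a later frame of each camera would supply the point correspondences needed to re-estimate the trifocal tensors over time; the details of that tensor update follow the paper rather than this sketch.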