Novel view image synthesis based on photo-consistent 3D model deformation

  • Authors:
  • Meng-Hsuan Chia; Chien-Hsing He; Huei-Yung Lin

  • Affiliations:
  • Department of Electrical Engineering, National Chung Cheng University, Chiayi 621, Taiwan (all authors)

  • Venue:
  • International Journal of Computational Science and Engineering
  • Year:
  • 2013

Abstract

In this paper, a system is designed for improving the quality of novel view synthesis. The basic idea is to refine the model using the camera information of the requested novel view. In this system, we first reconstruct a visual hull using shape-from-silhouette, and then refine this 3D model based on view dependency. The 3D points of the model are classified into outline points and non-outline points according to the virtual viewpoint. To refine the model, both the outline and non-outline points are moved iteratively by minimising an energy function until convergence. The key term is the photo-consistency energy, supplemented by a smoothness energy and a contour/visual hull energy; the latter two terms help avoid local minima when minimising the photo-consistency energy. Finally, we render the novel view image using view-dependent image synthesis, blending pixel values from the reference cameras nearest the virtual camera.
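The iterative refinement described in the abstract can be sketched as follows. This is a minimal gradient-descent illustration, not the authors' exact formulation: photo-consistency is mocked as a quadratic pull toward a photo-consistent target position per point, smoothness is a Laplacian-style pull toward the centroid of neighbouring points, and the contour/visual hull term penalises drift from the initial hull. All function names and weights are hypothetical.

```python
import numpy as np

def refine_points(points, neighbors, photo_target, hull_points,
                  lam_smooth=0.5, lam_hull=0.1, step=0.1,
                  iters=200, tol=1e-8):
    """Sketch of the iterative refinement: move 3D points to minimise
    E = E_photo + lam_smooth * E_smooth + lam_hull * E_hull.

    points       : (N, 3) current 3D model points
    neighbors    : list of index lists, neighbours of each point
    photo_target : (N, 3) mocked photo-consistent positions per point
    hull_points  : (N, 3) corresponding points on the initial visual hull
    """
    pts = points.copy()
    prev_e = np.inf
    for _ in range(iters):
        # Centroid of each point's neighbours (smoothness attractor).
        centroids = np.array([pts[n].mean(axis=0) for n in neighbors])
        # Approximate gradient of each quadratic energy term
        # (centroids are treated as fixed within one iteration).
        grad = (2.0 * (pts - photo_target)              # photo-consistency
                + lam_smooth * 2.0 * (pts - centroids)  # smoothness
                + lam_hull * 2.0 * (pts - hull_points)) # contour/visual hull
        pts -= step * grad
        # Total energy, used only for the convergence test.
        e = (((pts - photo_target) ** 2).sum()
             + lam_smooth * ((pts - centroids) ** 2).sum()
             + lam_hull * ((pts - hull_points) ** 2).sum())
        if abs(prev_e - e) < tol:  # stop when the energy has converged
            break
        prev_e = e
    return pts
```

With a hull at the origin and a photo-consistent target offset from it, the refined points settle between the two, weighted by the energy coefficients; the real system would evaluate photo-consistency by reprojecting each point into the reference images.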