Reconstructing animated meshes from time-varying point clouds

  • Authors:
  • Jochen Süßmuth, Marco Winter, Günther Greiner

  • Affiliation:
  • University Erlangen-Nuremberg, Germany (all authors)

  • Venue:
  • SGP '08 Proceedings of the Symposium on Geometry Processing
  • Year:
  • 2008


Abstract

In this paper, we describe a novel approach for the reconstruction of animated meshes from a series of time-deforming point clouds. Given a set of unordered point clouds captured by a fast 3-D scanner, our algorithm computes coherent meshes that approximate the input data at arbitrary time instances. Our method is based on the computation of an implicit function in R⁴ that approximates the time-space surface of the time-varying point cloud. We then use this four-dimensional implicit function to reconstruct a polygonal model for the first time-step. By sliding this template mesh along the time-space surface in an as-rigid-as-possible manner, we obtain reconstructions for subsequent time-steps that share the connectivity of the initially extracted mesh while recovering rigid motion exactly. The resulting animated meshes allow accurate motion tracking of arbitrary points and are well suited for animation compression. We demonstrate the qualities of the proposed method by applying it to several data sets acquired by real-time 3-D scanners.
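The core idea of the abstract can be illustrated with a toy sketch: the method represents the moving surface as the zero set of a function F: R⁴ → R over space and time, and a rigidly moving surface point stays on that zero set at every instant. The sketch below is NOT the paper's fitted implicit (which is reconstructed from scan data); it substitutes a hand-written analytic stand-in, a sphere translating along x, purely to show what "time-space implicit surface" and "recovering rigid motion exactly" mean.

```python
import numpy as np

def F(p, t):
    """Toy 4D implicit function (a stand-in for the fitted one in the paper):
    a unit sphere whose center translates along x with speed 0.5.
    F(p, t) == 0 exactly on the time-space surface."""
    center = np.array([0.5 * t, 0.0, 0.0])
    return np.linalg.norm(np.asarray(p) - center, axis=-1) - 1.0

# A point on the surface at t = 0 ...
p0 = np.array([1.0, 0.0, 0.0])
print(abs(F(p0, 0.0)) < 1e-12)   # True: p0 lies on the t = 0 slice

# ... advected by the same rigid translation, still lies on the
# zero set at t = 1, i.e. sliding along the time-space surface
# reproduces the rigid motion exactly.
p1 = p0 + np.array([0.5, 0.0, 0.0])
print(abs(F(p1, 1.0)) < 1e-12)   # True

# An un-advected point has drifted off the surface at t = 1:
print(abs(F(p0, 1.0)) < 1e-12)   # False
```

In the actual pipeline, a mesh is first extracted from the t = 0 slice of the fitted implicit, and its vertices are then slid along the zero set in an as-rigid-as-possible fashion to produce meshes with identical connectivity for every later time-step.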