Efficient reconstruction of nonrigid shape and motion from real-time 3D scanner data

  • Authors:
  • Michael Wand; Bart Adams; Maksim Ovsjanikov; Alexander Berner; Martin Bokeloh; Philipp Jenke; Leonidas Guibas; Hans-Peter Seidel; Andreas Schilling

  • Affiliations:
  • Saarland University and Max Planck Institut Informatik, Saarbrücken, Germany; Stanford University and Katholieke Universiteit Leuven; Stanford University; University of Tübingen, WSI/GRIS; University of Tübingen, WSI/GRIS; University of Tübingen, WSI/GRIS; Stanford University; Max Planck Institut Informatik, Saarbrücken, Germany; University of Tübingen, WSI/GRIS

  • Venue:
  • ACM Transactions on Graphics (TOG)
  • Year:
  • 2009

Abstract

We present a new technique for reconstructing a single shape and its nonrigid motion from 3D scanning data. Our algorithm takes as input a set of time-varying unstructured sample points that capture partial views of a deforming object, and reconstructs a single shape and a deformation field that fit the data. This representation yields dense correspondences for the whole sequence, as well as a completed 3D shape in every frame. In addition, the algorithm automatically removes spatial and temporal noise artifacts and outliers from the raw input data. Unlike previous methods, the algorithm does not require a shape template but computes a fitting shape automatically from the input data. Our reconstruction framework is based on a novel topology-aware adaptive subspace deformation technique that handles long sequences with complex geometry efficiently. The algorithm accesses the data in multiple sequential passes, so that long sequences can be streamed from hard disk rather than being limited by main memory. We apply the technique to several benchmark datasets, significantly increasing the complexity of the data that can be handled efficiently compared to previous work.
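
To make the representation described in the abstract concrete, the following is a minimal, hypothetical sketch in Python/NumPy of a "canonical shape plus per-frame subspace deformation" data model: a dense reconstructed shape is deformed by rigid transforms attached to a sparse set of deformation nodes and blended with smooth weights. It is not the authors' implementation; the function names (`skinning_weights`, `deform`), the Gaussian weighting, and the linear blending scheme are assumptions chosen only to illustrate why such a representation yields dense correspondences and a completed shape in every frame.

```python
# Hypothetical sketch (not the paper's code): canonical shape + per-frame
# subspace deformation defined on a sparse set of nodes.
import numpy as np

def skinning_weights(points, nodes, sigma=0.1):
    """Smooth blending weights from each surface point to each deformation node
    (Gaussian falloff, normalized per point) -- an assumed weighting scheme."""
    d2 = ((points[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)

def deform(points, weights, node_rot, node_trans, nodes):
    """Apply per-node rigid transforms to the canonical points and blend them
    linearly with the precomputed weights."""
    out = np.zeros_like(points)
    for j in range(nodes.shape[0]):
        local = points - nodes[j]                       # coords relative to node j
        moved = local @ node_rot[j].T + nodes[j] + node_trans[j]
        out += weights[:, [j]] * moved
    return out

# Example: N canonical points, K deformation nodes, one frame of the sequence.
N, K = 1000, 20
urshape = np.random.rand(N, 3)                          # stand-in canonical shape
nodes = urshape[np.random.choice(N, K, replace=False)]  # sparse deformation nodes
W = skinning_weights(urshape, nodes)
R = np.tile(np.eye(3), (K, 1, 1))                       # per-node rotations (identity here)
t = 0.01 * np.random.randn(K, 3)                        # per-node translations for this frame
frame_shape = deform(urshape, W, R, t, nodes)
# Row i of frame_shape is the same material point as row i of urshape in every
# frame, which is what "dense correspondences for the whole sequence" means; the
# deformed canonical shape also fills in regions not seen by the scanner.
```

Because only the per-node transforms change from frame to frame, frames can in principle be processed one at a time in sequential passes over the data, which is consistent with the streaming, out-of-core behavior the abstract describes.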