Automatic 3D facial model and texture reconstruction from range scans

  • Authors:
  • Guofu Xiang; Xiangyang Ju; Patrik O'B. Holt

  • Affiliations:
  • Cognitive Engineering Research Group, School of Computing, The Robert Gordon University, Aberdeen, UK (all authors)

  • Venue:
  • AMDO'10: Proceedings of the 6th International Conference on Articulated Motion and Deformable Objects
  • Year:
  • 2010

Abstract

This paper presents a fully automatic approach to fitting a generic facial model to detailed range scans of human faces, reconstructing 3D facial models and textures with no manual intervention (such as specifying landmarks). A Scaling Iterative Closest Points (SICP) algorithm is introduced to compute the optimal rigid registrations between the generic model and range scans of different sizes. A new template-fitting method, formulated in an optimization framework that minimizes a physically based elastic energy derived from thin shells, then faithfully reconstructs the surfaces and textures from the range scans and yields dense point correspondences across the reconstructed facial models. Finally, we demonstrate a facial expression transfer method that clones facial expressions from the generic model onto the reconstructed facial models using the deformation transfer technique.
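The abstract names two computational steps that lend themselves to a concrete sketch. The first is SICP, i.e. ICP extended with a uniform scale factor so that a generic model and a scan of different size can be brought into rigid alignment. The sketch below is an illustrative reconstruction that uses the Umeyama closed-form similarity estimate inside an ICP loop; the function name `sicp`, the fixed iteration count, and the omission of scale bounds, convergence tests, and outlier rejection are assumptions made for brevity, not the authors' exact formulation.

```python
# Minimal sketch of a Scaling ICP (SICP) iteration: standard ICP extended
# with a uniform scale estimated in closed form (Umeyama-style).
# Illustrative reconstruction only; not the paper's exact algorithm.
import numpy as np
from scipy.spatial import cKDTree

def sicp(source, target, iters=30):
    """Align `source` (n,3) to `target` (m,3) with scale s, rotation R, translation t."""
    tree = cKDTree(target)
    s, R, t = 1.0, np.eye(3), np.zeros(3)
    for _ in range(iters):
        # 1. Correspondences: closest target point to each transformed source point.
        moved = s * (source @ R.T) + t
        _, idx = tree.query(moved)
        q = target[idx]
        # 2. Closed-form similarity transform from source to its correspondences.
        mu_p, mu_q = source.mean(0), q.mean(0)
        P, Q = source - mu_p, q - mu_q
        cov = Q.T @ P / len(source)           # cross-covariance
        U, D, Vt = np.linalg.svd(cov)
        sign = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
        S = np.diag([1.0, 1.0, sign])          # reflection guard
        R = U @ S @ Vt
        var_p = (P ** 2).sum() / len(source)   # variance of centered source
        s = np.trace(np.diag(D) @ S) / var_p   # optimal uniform scale
        t = mu_q - s * (R @ mu_p)
    return s, R, t
```

The template-fitting step is described only as minimizing a thin-shell elastic energy. A commonly used linearized discrete form of such an energy (an assumption for illustration, not necessarily the paper's exact functional) penalizes membrane stretching and Laplacian-based bending relative to the undeformed template $\bar{x}$, plus a closest-point fitting term:

```latex
E(x) = \alpha \sum_{(i,j)\in\mathcal{E}} \bigl\| (x_i - x_j) - (\bar{x}_i - \bar{x}_j) \bigr\|^2
     + \beta  \sum_{i} \bigl\| \Delta x_i - \Delta \bar{x}_i \bigr\|^2
     + \sum_{i} w_i \, \| x_i - c_i \|^2
```

where $\Delta$ is a discrete Laplace operator on the template mesh, $c_i$ is the closest point on the range scan to vertex $i$, and $\alpha$, $\beta$, $w_i$ weight stiffness against fitting accuracy.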