EDFCES: a new example-driven 3D face construction and editing system

  • Authors:
  • Yu Zhang; Eric Sung

  • Affiliations:
  • Department of Computer and Information Science, University of Pennsylvania, PA; School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore

  • Venue:
  • Machine Graphics & Vision International Journal

  • Year:
  • 2008

Abstract

This paper presents an automatic runtime system for generating varied, realistic face models by synthesizing a global face shape and local facial features according to intuitive, high-level control parameters. Our method takes 3D face scans as examples in order to exploit the parameter-to-geometry correlations present in real faces. To establish correspondences among the scanned models, we use a three-step model-fitting approach that conforms a generic head mesh onto each scanned model. We transform the resulting data sets of global face shapes and local feature shapes into vector space representations by applying principal component analysis (PCA). We compute a set of face anthropometric measurements to parameterize the example shapes in the measurement spaces. Using PCA coefficients as a compact shape representation, we approach the shape synthesis problem by forming scattered data interpolation functions that generate the desired face shape from anthropometric parameters given as input. At runtime, the interpolation functions are evaluated for the input parameter values to produce new face geometries at an interactive rate. The correspondence among all example face textures is obtained by parameterizing the 3D generic mesh over a 2D image domain. A new feature texture with the desired attributes is synthesized by interpolating the example textures. The resulting system offers intuitive, fine-grained control. We demonstrate our method by applying different parameters to generate a wide range of face models.
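
The core pipeline the abstract describes (PCA as a compact shape representation, plus scattered data interpolation from anthropometric measurements to PCA coefficients) can be summarized in a short sketch. The snippet below is not the authors' code: it is a minimal illustration using SciPy's RBFInterpolator and scikit-learn's PCA, with made-up example counts, measurement dimensions, and random placeholder arrays standing in for the conformed scan data.

```python
# Minimal sketch of parameter-driven shape synthesis: PCA compresses
# example face shapes, and an RBF interpolant maps anthropometric
# measurements to PCA coefficients. All sizes and data are illustrative.
import numpy as np
from scipy.interpolate import RBFInterpolator
from sklearn.decomposition import PCA

# --- Training data (placeholders for the conformed example scans) ---
# shapes: one row per example mesh, flattened (x, y, z) vertex coords.
n_examples, n_vertices, n_params = 50, 2000, 5
rng = np.random.default_rng(0)
shapes = rng.normal(size=(n_examples, 3 * n_vertices))
# measurements: anthropometric parameters computed on each example,
# e.g. face width, nose length, eye separation (5 here, arbitrarily).
measurements = rng.uniform(size=(n_examples, n_params))

# 1. PCA turns each example into a short coefficient vector instead
#    of thousands of raw coordinates.
pca = PCA(n_components=20)
coeffs = pca.fit_transform(shapes)           # (n_examples, 20)

# 2. Scattered data interpolation from measurement space to PCA
#    coefficient space: the parameter-to-geometry mapping.
interp = RBFInterpolator(measurements, coeffs, kernel="thin_plate_spline")

# 3. Runtime synthesis: evaluate the interpolant at user-supplied
#    parameter values and reconstruct vertex positions from PCA.
def synthesize(params: np.ndarray) -> np.ndarray:
    """Return flattened vertex positions for one parameter vector."""
    c = interp(params[None, :])               # interpolated coefficients
    return pca.inverse_transform(c)[0]        # back to vertex space

new_face = synthesize(np.full(n_params, 0.5))
print(new_face.shape)                         # (6000,) = 3 * n_vertices
```

In the paper this mapping is built separately for the global face shape and for each local feature, and the same interpolation idea is applied to the example textures; the single mapping above stands in for all of them.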