Comprehending and transferring facial expressions based on statistical shape and texture models

  • Authors:
  • Pengcheng Xi;Won-Sook Lee;Gustavo Frederico;Chris Joslin;Lihong Zhou

  • Affiliations:
  • School of Information Technology and Engineering, University of Ottawa, ON, Canada;School of Information Technology and Engineering, University of Ottawa, ON, Canada;School of Information Technology and Engineering, University of Ottawa, ON, Canada;School of Information Technology, Carleton University, Ottawa, ON, Canada;Information Security Center, Southeast University, Nanjing, China

  • Venue:
  • CGI'06 Proceedings of the 24th international conference on Advances in Computer Graphics
  • Year:
  • 2006

Abstract

We introduce an efficient approach for representing a human face using a limited number of images. This compact representation allows for meaningful manipulation of the face. Principal Component Analysis (PCA) is used to separate facial features and build statistical shape and texture models, so that changing the model parameters creates images with different expressions and poses. By presenting newly created faces to reviewers, who rate them for intensity of masculinity, friendliness and attractiveness, we analyze the relations between the model parameters and these intensities. Using feature selection, we rank the parameters by their importance in determining these three aspects. This allows us to control the models and transform a new face image into a naturally masculine, friendly or attractive one. In the PCA-based feature space, we can successfully transfer expressions from one subject onto a novel person's face.
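
The abstract describes building PCA models over vectorized face shapes and textures, editing the resulting parameters, and transferring expressions in the parameter space. The following minimal Python sketch illustrates that general idea under stated assumptions: the toy data, array shapes, and the expression-transfer step (an offset between two parameter vectors) are illustrative choices, not the authors' exact pipeline.

```python
import numpy as np

def build_pca_model(samples, var_kept=0.95):
    """Fit a PCA model to (n_samples, n_dims) vectorized shapes or textures."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # SVD of the centered data gives the principal modes of variation.
    _, S, Vt = np.linalg.svd(centered, full_matrices=False)
    var = (S ** 2) / (len(samples) - 1)
    k = np.searchsorted(np.cumsum(var) / var.sum(), var_kept) + 1
    return mean, Vt[:k], var[:k]          # mean, components, per-mode variance

def project(face, mean, components):
    """Model parameters (PCA coefficients) of a single vectorized face."""
    return components @ (face - mean)

def reconstruct(params, mean, components):
    """New face instance from (possibly edited) model parameters."""
    return mean + components.T @ params

# Toy usage: transfer an expression between subjects in parameter space.
rng = np.random.default_rng(0)
faces = rng.normal(size=(40, 300))            # 40 toy vectorized faces
mean, comps, _ = build_pca_model(faces)

neutral_a = project(faces[0], mean, comps)
smiling_a = project(faces[1], mean, comps)
neutral_b = project(faces[2], mean, comps)

expression_offset = smiling_a - neutral_a     # expression as a parameter offset
transferred_b = reconstruct(neutral_b + expression_offset, mean, comps)
```

In this sketch, editing individual entries of the parameter vector corresponds to moving along single modes of variation, which is one plausible way the ranked parameters mentioned in the abstract could be manipulated to alter perceived masculinity, friendliness or attractiveness.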