Principal component analysis for facial animation

  • Authors:
  • K. Goudeaux;Tsuhan Chen;Shyue-Wu Wang;Jen-Duo Liu

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Carnegie Mellon Univ., Pittsburgh, PA, USA

  • Venue:
  • ICASSP '01: Proceedings of the 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 03
  • Year:
  • 2001


Abstract

This paper presents a technique for animating a three-dimensional face model through the application of principal component analysis (PCA). Using PCA has several advantages over traditional approaches to facial animation because it reduces the number of parameters needed to describe a face and confines the facial motion to a valid space to prevent unnatural contortions. First, real data is optically captured in real time from a human subject using infrared cameras and reflective trackers. This data is analyzed to find a mean face and a set of eigenvectors and eigenvalues that are used to perturb the mean face within the range described by the captured data. The result is a set of vectors that can be linearly combined and interpolated to represent different facial expressions and animations. We also show that it is possible to map the eigenvectors of one face onto another face or to change the eigenvectors to describe new motion.
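The pipeline described above — capture marker data, compute a mean face and eigenvectors/eigenvalues, then perturb the mean within the range spanned by the captured data — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the marker count, frame count, and random stand-in data are assumptions (real input would be the infrared marker-tracking capture the paper describes).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical capture: 100 frames of 30 reflective markers, each with
# (x, y, z) coordinates, flattened into 90-dimensional face vectors.
# Random data stands in for the optically captured motion.
frames = rng.normal(size=(100, 90))

# Mean face and principal components via SVD of the centered data.
mean_face = frames.mean(axis=0)
centered = frames - mean_face
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
eigenvectors = Vt                        # rows are principal directions
eigenvalues = (S ** 2) / (len(frames) - 1)

# A new expression is the mean face perturbed along the top k components.
# Confining each coefficient to roughly the range observed in the data
# (here, 3 standard deviations) keeps the face in a valid space and
# avoids unnatural contortions.
k = 5
coeffs = rng.uniform(-1.0, 1.0, size=k) * 3.0 * np.sqrt(eigenvalues[:k])
expression = mean_face + coeffs @ eigenvectors[:k]
```

Interpolating the coefficient vectors of two expressions (rather than the raw marker positions) then yields in-between frames that stay within the valid facial space.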