Feature points based facial animation retargeting

  • Authors:
  • Ludovic Dutreve, Alexandre Meyer, Saïda Bouakaz

  • Affiliations:
  • Université de Lyon, France (all authors)

  • Venue:
  • Proceedings of the 2008 ACM symposium on Virtual reality software and technology
  • Year:
  • 2008

Abstract

We present a method for transferring facial animation in real time. The source animation may be an existing 3D animation or 2D data provided by a video tracker or a motion capture system. Based on two sets of feature points manually selected on the source and target faces (the only manual work required), an RBF network is trained to provide a geometric transformation between the two faces. At each frame, this transformation is applied to the new positions of the source feature points, yielding new positions for the target feature points that are consistent with both the expression of the source face and the morphology of the target face. Driven by these displacements over time, we deform the target mesh on the GPU with linear blend skinning (LBS). To make our approach attractive to novice users, we propose a procedural technique that automatically rigs the target face by generating per-vertex weights for the skinning deformation. In summary, our method provides interactive expression transfer with minimal human intervention during setup and accepts various kinds of animation sources.
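The feature-point transfer step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a Gaussian RBF kernel and a hypothetical shape parameter `eps`; the paper does not specify these details. The network is trained once on corresponding neutral-pose feature points of the two faces, then applied per frame to the tracked source points.

```python
import numpy as np

def rbf_fit(src_pts, dst_pts, eps=1.0):
    """Train an RBF network mapping source feature points to target ones.

    src_pts, dst_pts: (n, 3) arrays of corresponding neutral-pose feature
    points on the source and target faces. Gaussian kernel (an assumption;
    other radial kernels work the same way). Returns the (n, 3) weights.
    """
    # Pairwise distances between training points -> (n, n) kernel matrix.
    d = np.linalg.norm(src_pts[:, None, :] - src_pts[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)
    # Solve phi @ w = dst so the network interpolates the training pairs.
    return np.linalg.solve(phi, dst_pts)

def rbf_apply(weights, src_pts, query, eps=1.0):
    """Map arbitrary source-space points (e.g. per-frame tracked feature
    points) through the trained network into target-face space."""
    d = np.linalg.norm(query[:, None, :] - src_pts[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)
    return phi @ weights
```

By construction the network reproduces the training correspondences exactly, so the neutral source face maps onto the neutral target face; per-frame source feature points then land at morphologically adapted target positions.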
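The deformation step, linear blend skinning, can likewise be sketched. This is a generic CPU sketch of standard LBS (the paper runs it on the GPU): each vertex is transformed by every influencing bone and the results are blended by the per-vertex weights that the procedural rigging produces. The array shapes are assumptions for illustration.

```python
import numpy as np

def linear_blend_skinning(verts, weights, transforms):
    """Standard linear blend skinning.

    verts: (n, 3) rest-pose vertex positions.
    weights: (n, k) per-vertex bone weights, each row summing to 1.
    transforms: (k, 4, 4) homogeneous bone transforms.
    """
    # Homogeneous coordinates: (n, 4).
    vh = np.hstack([verts, np.ones((len(verts), 1))])
    # Transform every vertex by every bone: per_bone[n, k, :] = T_k @ v_n.
    per_bone = np.einsum('kij,nj->nki', transforms, vh)
    # Blend the per-bone results with the vertex weights.
    blended = np.einsum('nk,nki->ni', weights, per_bone)
    return blended[:, :3]
```

A vertex weighted 0.5/0.5 between an identity bone and a bone translated by +2 in x ends up halfway, shifted by +1 in x, which is the characteristic averaging behavior of LBS.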