Real-time facial animation from live video tracking

  • Authors:
  • Taehyun Rhee, Youngkyoo Hwang, James Dokyoon Kim, Changyeong Kim

  • Affiliations:
  • Samsung Advanced Institute of Technology (SAIT), for all authors

  • Venue:
  • SCA '11 Proceedings of the 2011 ACM SIGGRAPH/Eurographics Symposium on Computer Animation
  • Year:
  • 2011


Abstract

This paper describes the complete pipeline of a practical system for producing real-time facial expressions of a 3D virtual avatar controlled by an actor's live performance. The system handles the practical challenges arising from markerless expression capture with a single conventional video camera. For robust tracking, a localized algorithm constrained by belief propagation is applied to the upper face, and an appearance-matching technique using a parameterized generic face model is exploited for lower-face and head-pose tracking. The captured expression features are then transferred to high-dimensional 3D animation controls via our facial expression space, a structure-preserving map between two algebraic structures. The transferred animation controls drive the facial animation of the 3D avatar while optimizing the smoothness of the face mesh. Finally, an example-based face deformation technique produces non-linear local detail deformations on the avatar that are not captured by the movement of the animation controls.
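The transfer step described above, mapping captured expression features to animation controls that drive a deformable avatar, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a linear expression-space map fitted by least squares from example (feature, control) pairs, and a simple blendshape-style avatar (neutral mesh plus per-control vertex deltas); all names, dimensions, and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feat, n_ctrl, n_verts = 12, 4, 100  # illustrative sizes

# Example pairs (captured feature vector -> animation controls),
# standing in for the expression-space correspondence. Here the
# controls are generated from a known linear map for demonstration.
F = rng.standard_normal((20, n_feat))          # tracked-feature examples
M_true = rng.standard_normal((n_ctrl, n_feat))
W = F @ M_true.T                               # matching control examples

# Fit the feature->control map by least squares (F @ X ≈ W).
X, *_ = np.linalg.lstsq(F, W, rcond=None)
M = X.T                                        # shape (n_ctrl, n_feat)

# Blendshape-style avatar: neutral mesh plus one delta per control.
neutral = rng.standard_normal((n_verts, 3))
deltas = rng.standard_normal((n_ctrl, n_verts, 3))

def animate(features):
    """Map captured features to controls, then deform the avatar."""
    w = M @ features                           # transferred controls
    return neutral + np.tensordot(w, deltas, axes=1)

frame = animate(F[0])
print(frame.shape)  # (100, 3): one deformed mesh per video frame
```

In this sketch each incoming video frame yields a feature vector, `animate` produces the deformed mesh, and the example-based detail deformations of the paper would be layered on top of this coarse result.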