Face/Off: live facial puppetry

  • Authors: Thibaut Weise; Hao Li; Luc Van Gool; Mark Pauly
  • Affiliations: ETH Zurich (all authors)
  • Venue: Proceedings of the 2009 ACM SIGGRAPH/Eurographics Symposium on Computer Animation
  • Year: 2009

Abstract

We present a complete integrated system for live facial puppetry that enables high-resolution real-time facial expression tracking with transfer to another person's face. The system utilizes a real-time structured light scanner that provides dense 3D data and texture. A generic template mesh, fitted to a rigid reconstruction of the actor's face, is tracked offline in a training stage through a set of expression sequences. These sequences are used to build a person-specific linear face model that is subsequently used for online face tracking and expression transfer. Even with just a single rigid pose of the target face, convincing real-time facial animations are achievable. The actor becomes a puppeteer with complete and accurate control over a digital face.
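The core idea of a person-specific linear face model is that any tracked expression can be written as the neutral (rigid) reconstruction plus a linear combination of expression basis vectors, and that expression transfer amounts to reusing the estimated coefficients on another person's model. The sketch below illustrates this in NumPy under stated assumptions: the class and function names (LinearFaceModel, fit, transfer_expression) are illustrative, the basis is assumed to come from the offline training sequences (e.g. via PCA), and the two bases are assumed to be in semantic correspondence. It is a minimal sketch of the linear-model concept, not the authors' implementation, which also handles rigid alignment, structured-light input, and real-time constraints.

```python
import numpy as np

class LinearFaceModel:
    """Person-specific linear face model: a mesh is the neutral (rigid) face
    plus a linear combination of expression basis vectors (hypothetical API)."""

    def __init__(self, neutral, basis):
        # neutral: (3N,) flattened vertex positions of the rigid reconstruction
        # basis:   (3N, K) expression modes, e.g. extracted from the tracked
        #          offline training sequences
        self.neutral = neutral
        self.basis = basis

    def reconstruct(self, coeffs):
        """Evaluate the model: mesh = neutral + basis @ coeffs."""
        return self.neutral + self.basis @ coeffs

    def fit(self, observed):
        """Least-squares estimate of expression coefficients for an observed,
        already rigidly aligned mesh, as would happen per frame online."""
        coeffs, *_ = np.linalg.lstsq(self.basis, observed - self.neutral,
                                     rcond=None)
        return coeffs


def transfer_expression(source_model, target_model, observed_source):
    """Expression transfer: estimate the actor's coefficients, then apply the
    same coefficients to the target's model (assumes mode k encodes the same
    expression on both faces)."""
    coeffs = source_model.fit(observed_source)
    return target_model.reconstruct(coeffs)
```

Because only the coefficients cross from actor to target, the target side needs nothing beyond its own linear model, which is consistent with the claim that a single rigid pose of the target face can suffice once a corresponding expression basis is available for it.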