Kinect-based facial animation

  • Authors: Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
  • Affiliations: EPFL; EPFL; Columbia University; EPFL
  • Venue: SIGGRAPH Asia 2011 Emerging Technologies
  • Year: 2011

Abstract

In this demo we present our system for performance-based character animation that enables any user to control the facial expressions of a digital avatar in realtime. Compared to existing technologies, our system is easy to deploy and does not require face markers, intrusive lighting, or complex scanning hardware. Instead, the user is recorded in a natural environment with the non-intrusive, commercially available Microsoft Kinect 3D sensor. Since high noise levels in the acquired data prevent conventional tracking methods from working well, we developed a method that combines a database of existing animations with facial tracking to generate compelling animations. Realistic facial tracking facilitates a range of new applications, e.g., digital gameplay, telepresence, and social interactions.
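The abstract only outlines how tracking and the animation database interact. Below is a minimal conceptual sketch, not the authors' implementation: it assumes a linear blendshape face model and a simple quadratic prior whose mean and inverse covariance stand in for statistics learned from an existing animation database. All function and variable names are illustrative.

    # Conceptual sketch (assumed model, not the published system): estimate
    # per-frame blendshape weights from a noisy depth observation, regularized
    # toward a prior derived from a database of existing animations.
    import numpy as np

    def fit_blendshape_weights(B, b0, observed, prior_mean, prior_cov_inv, lam=0.5):
        """Minimize ||(b0 + B x) - observed||^2
                    + lam * (x - prior_mean)^T prior_cov_inv (x - prior_mean).
        B: (3N, K) blendshape basis, b0: (3N,) neutral face,
        observed: (3N,) noisy per-frame scan, x: (K,) blendshape weights."""
        r = observed - b0
        A = B.T @ B + lam * prior_cov_inv          # normal equations + prior term
        rhs = B.T @ r + lam * prior_cov_inv @ prior_mean
        x = np.linalg.solve(A, rhs)
        return np.clip(x, 0.0, 1.0)                # weights are typically kept in [0, 1]

    # Toy usage with random data standing in for Kinect depth measurements.
    rng = np.random.default_rng(0)
    N, K = 500, 30                                 # N vertices, K blendshapes
    B = rng.normal(size=(3 * N, K))
    b0 = rng.normal(size=3 * N)
    true_x = rng.uniform(0.0, 1.0, size=K)
    observed = b0 + B @ true_x + rng.normal(scale=0.3, size=3 * N)  # noisy frame
    prior_mean = np.full(K, 0.5)
    prior_cov_inv = np.eye(K)
    weights = fit_blendshape_weights(B, b0, observed, prior_mean, prior_cov_inv)
    print(np.round(weights[:5], 2))

The prior term is what lets the fit remain stable when the depth data are noisy: with lam = 0 the solve reduces to an ordinary least-squares fit to the scan, while larger lam pulls the weights toward expressions that resemble the example animations.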