Lip-Sync in Human Face Animation Based on Video Analysis and Spline Models

  • Authors:
  • Sy-sen Tang;Alan Wee-Chung Liew;Hong Yan

  • Venue:
  • MMM '04 Proceedings of the 10th International Multimedia Modelling Conference
  • Year:
  • 2004

Abstract

Human facial animation is an interesting and difficult problem in computer graphics. In this paper, a novel B-spline (NURBS) muscle system is proposed to simulate 3D facial expression and talking animation. The system obtains lip shape parameters from video capturing a real person's lip movement and uses them to control the appropriate muscles to form different phonemes. The muscles are constructed from non-uniform rational B-spline curves based on anatomical knowledge. By using different numbers of control points on the muscles, more detailed facial expressions and mouth shapes can be simulated. We demonstrate the flexibility of our model by simulating different emotions and by lip-syncing a talking head to a video using the automatically extracted lip parameters.
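
The paper is not accompanied by code here. As a rough, illustrative sketch of the mechanism the abstract describes (a small set of weighted control points defining a smooth NURBS contour), the Python below evaluates a rational B-spline curve for a hypothetical upper-lip control polygon. The control points, weights, and knot vector are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not from the paper): evaluating a NURBS curve from a small
# set of control points, the basic mechanism the abstract describes for
# shaping muscle/mouth contours. All numeric values below are hypothetical.

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right


def nurbs_point(u, control_points, weights, knots, degree):
    """Evaluate C(u) = sum_i N_{i,p}(u) w_i P_i / sum_i N_{i,p}(u) w_i."""
    # Clamp u just inside the domain so the half-open basis intervals cover it.
    u = min(u, knots[-1] - 1e-9)
    num = [0.0] * len(control_points[0])
    den = 0.0
    for i, (point, w) in enumerate(zip(control_points, weights)):
        b = bspline_basis(i, degree, u, knots) * w
        den += b
        num = [n + b * c for n, c in zip(num, point)]
    return [n / den for n in num]


if __name__ == "__main__":
    # Hypothetical 2D control polygon for an upper-lip contour: degree-2 NURBS
    # with a clamped knot vector. Heavier weights pull the curve toward a
    # control point, so a few parameters can reshape the mouth for a phoneme.
    control_points = [(0.0, 0.0), (0.2, 0.3), (0.5, 0.1), (0.8, 0.3), (1.0, 0.0)]
    weights = [1.0, 2.0, 1.0, 2.0, 1.0]
    degree = 2
    knots = [0, 0, 0, 1, 2, 3, 3, 3]  # len = n_points + degree + 1

    for k in range(11):
        u = knots[degree] + (knots[-degree - 1] - knots[degree]) * k / 10.0
        x, y = nurbs_point(u, control_points, weights, knots, degree)
        print(f"u={u:.2f}  ({x:.3f}, {y:.3f})")
```

Moving or reweighting one or two control points reshapes the whole contour smoothly, which is the property that lets a handful of lip parameters extracted from video drive different phoneme mouth shapes.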