FacEMOTE: qualitative parametric modifiers for facial animations

  • Authors: Meeran Byun, Norman I. Badler
  • Affiliations: University of Pennsylvania (both authors)
  • Venue: Proceedings of the 2002 ACM SIGGRAPH/Eurographics symposium on Computer animation
  • Year: 2002


Abstract

We propose a control mechanism for facial expressions that applies a few carefully chosen parametric modifications to pre-existing expression data streams. This approach applies to any facial animation resource expressed in the general MPEG-4 form, whether taken from a library of preset facial expressions, captured from live performance, or created entirely by hand. The MPEG-4 Facial Animation Parameters (FAPs) represent a facial expression as a set of parameterized muscle actions, given as intensities of individual muscle movements over time. Our system varies expressions by changing the intensities and scope of sets of MPEG-4 FAPs. It creates variations in "expressiveness" across the face model rather than simply scaling, interpolating, or blending facial mesh node positions. The parameters are adapted from the Effort parameters of Laban Movement Analysis (LMA); we developed a mapping from their values onto sets of FAPs. The FacEMOTE parameters thus perturb a base expression to create a wide range of expressions. Such an approach could allow real-time face animations to change underlying speech or facial expression shapes dynamically according to current agent affect or user interaction needs.
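To make the abstract's idea concrete, here is a minimal sketch of perturbing a base MPEG-4 FAP stream with Effort-style parameters. The function name `perturb_faps`, the choice of exactly two Effort dimensions (`weight` for intensity, `time` for sudden vs. sustained timing), and the specific scaling constants are all illustrative assumptions, not the paper's actual Effort-to-FAP mapping.

```python
def perturb_faps(fap_stream, weight=0.0, time=0.0):
    """Return a perturbed copy of a FAP stream (hypothetical sketch).

    fap_stream: dict mapping FAP id -> list of per-frame intensities.
    weight in [-1, 1]: scales movement amplitude (heavier -> larger).
    time in [-1, 1]: compresses or stretches timing (sudden -> shorter).
    The 0.5 gain factors below are arbitrary illustrative choices.
    """
    out = {}
    for fap_id, frames in fap_stream.items():
        # Intensity scaling: stronger Effort-Weight -> larger amplitudes.
        scaled = [v * (1.0 + 0.5 * weight) for v in frames]
        # Timing change: resample to a new length by linear interpolation.
        n = max(2, round(len(scaled) / (1.0 + 0.5 * time)))
        resampled = []
        for i in range(n):
            t = i * (len(scaled) - 1) / (n - 1)
            lo = int(t)
            hi = min(lo + 1, len(scaled) - 1)
            frac = t - lo
            resampled.append(scaled[lo] * (1 - frac) + scaled[hi] * frac)
        out[fap_id] = resampled
    return out


# Example: a single FAP intensity curve, made "heavier" but with timing kept.
base = {3: [0.0, 1.0, 0.0]}  # FAP id 3 chosen arbitrarily for illustration
heavy = perturb_faps(base, weight=1.0, time=0.0)
```

Because the perturbation operates on the parameter streams rather than on mesh vertices, the same Effort settings can be reapplied to any base expression in the library, which is the property the abstract emphasizes.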