Estimation of FAPs and intensities of AUs based on real-time face tracking

  • Authors:
  • Bingqing Qu;Sathish Pammi;Radoslaw Niewiadomski;Gérard Chollet

  • Affiliations:
  • Télécom ParisTech, Paris, France (all authors)

  • Venue:
  • Proceedings of the 3rd Symposium on Facial Analysis and Animation
  • Year:
  • 2012


Abstract

Imitating natural facial behavior in real time remains challenging, particularly for spontaneous behavior such as laughter and other nonverbal expressions. This paper describes our ongoing work on methodologies and tools for estimating Facial Animation Parameters (FAPs) and intensities of Action Units (AUs) in order to imitate lifelike facial expressions with an MPEG-4 compliant Embodied Conversational Agent (ECA) -- the GRETA agent (Bevacqua et al. 2007). First, we investigate available open-source tools for accurate facial landmark localization. Second, FAPs and AU intensities are estimated from facial landmarks computed with an open-source face tracker. Finally, the paper discusses our ongoing work to determine, through perceptual studies, which of the FAP-based and AU-based synthesis technologies yields better re-synthesis, evaluating: (i) the naturalness of the synthesized facial expressions; (ii) the similarity perceived by subjects when comparing them to the original user's behavior.
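The abstract's core step, estimating FAPs from tracked landmarks, follows the MPEG-4 convention of expressing landmark displacements relative to the neutral face in Facial Animation Parameter Units (FAPUs), scaled by 1/1024. A minimal sketch of that conversion is below; the function and landmark values are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch: converting a tracked landmark displacement into an
# MPEG-4-style FAP value. FAPUs (e.g. MNS, mouth-nose separation) are
# distances measured on the neutral face; FAP values are expressed in
# units of 1/1024 of a FAPU so animation is independent of face geometry.

def fap_value(neutral_y: float, current_y: float, fapu_distance: float) -> int:
    """Landmark displacement relative to neutral, in 1/1024 FAPU units."""
    displacement = current_y - neutral_y
    return round(1024 * displacement / fapu_distance)

# Example: a lower-lip landmark moves 0.6 spatial units from neutral on a
# face whose MNS distance is 2.0 units.
print(fap_value(10.0, 10.6, 2.0))
```

Because the displacement is normalized by a neutral-face distance, the same FAP stream can animate agents with different face proportions, which is what makes the MPEG-4 pipeline suitable for driving the GRETA agent from an arbitrary tracked user.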