Efficient Active Appearance Model for Real-Time Head and Facial Feature Tracking

  • Authors:
  • F. Dornaika; J. Ahlberg

  • Venue:
  • AMFG '03 Proceedings of the IEEE International Workshop on Analysis and Modeling of Faces and Gestures
  • Year:
  • 2003

Abstract

This paper addresses the 3D tracking of the pose and animation of the human face in monocular image sequences using Active Appearance Models. Classical appearance-based tracking suffers from two disadvantages: (i) the estimated out-of-plane motions are not very accurate, and (ii) the convergence of the optimization process to the desired minima is not guaranteed. In this paper, we aim at designing an efficient active appearance model that copes with these disadvantages by retaining the strengths of both feature-based and featureless tracking methodologies. For each frame, the adaptation is split into two consecutive stages. In the first stage, the 3D head pose is recovered using robust statistics and a measure of consistency with a statistical model of the face texture. In the second stage, the local motion associated with selected facial features is recovered using the active appearance model search. Tracking experiments and method comparisons demonstrate the robustness and superior performance of the developed framework.
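The two-stage per-frame adaptation described in the abstract can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the single scalar standing in for the 6-DoF head pose, the linear texture model `A`, the Cauchy robust weights, and the precomputed update matrix `R` are all assumptions chosen only to make the two stages runnable.

```python
import numpy as np

# Toy setup (assumed, for illustration only): the "texture" is 8 pixels,
# one scalar stands in for the head pose, and two local feature
# parameters perturb only the last two pixels.
BASE = np.ones(8)                        # global (pose-driven) appearance
A = np.zeros((8, 2))                     # local feature modes (sparse)
A[6, 0] = 1.0
A[7, 1] = 1.0
TRUE_POSE, TRUE_PARAMS = 2.0, np.array([0.5, -0.3])

def render(pose, params):
    """Synthesize a texture from the global pose and local feature params."""
    return pose * BASE + A @ params

observed = render(TRUE_POSE, TRUE_PARAMS)

def stage1_robust_pose(obs, n_iters=50, scale=0.1):
    """Stage 1: recover the global pose with robust statistics.
    Iteratively reweighted least squares with Cauchy weights downweights
    pixels moved by local feature motion, treating them as outliers."""
    pose = 0.0
    for _ in range(n_iters):
        r = obs - pose * BASE
        w = 1.0 / (1.0 + (r / scale) ** 2)       # Cauchy robust weights
        pose = np.sum(w * BASE * obs) / np.sum(w * BASE * BASE)
    return pose

def stage2_aam_search(obs, pose, n_iters=5):
    """Stage 2: recover local feature motion with an AAM-style search:
    the parameter update is a precomputed linear function (here simply
    the pseudo-inverse of A) of the texture residual."""
    R = np.linalg.pinv(A)                        # precomputed update matrix
    params = np.zeros(2)
    for _ in range(n_iters):
        residual = render(pose, params) - obs
        params -= R @ residual
    return params

pose_est = stage1_robust_pose(observed)
params_est = stage2_aam_search(observed, pose_est)
```

Because the robust weights suppress the pixels affected by local feature motion, the stage-1 pose estimate stays close to the true value even though it ignores the feature parameters entirely; stage 2 then explains the remaining residual with a linear AAM-style update.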