Analysis of colour space transforms for person independent AAMs

  • Authors:
  • Tadas Baltrušaitis; Peter Robinson

  • Affiliations:
  • University of Cambridge; University of Cambridge

  • Venue:
  • Proceedings of the SSPNET 2nd International Symposium on Facial Analysis and Animation
  • Year:
  • 2010

Abstract

Statistical models of non-rigid deformable objects, such as Active Appearance Models (AAMs), are a popular means of registration, tracking, and synthesis of faces. Due to their rapid fitting and good accuracy, they are used extensively for facial expression tracking and analysis. A problem facing AAM-based face tracking is its inability to generalise well to unseen faces, especially those from unseen databases. One way to overcome this problem is the combined use of several available databases, since some capture lighting variation while others capture pose or expression variation. In addition, this allows us to capture more variation in ethnicity, gender, and age. Use of multiple databases therefore gives us a better opportunity to create person-, expression-, and pose-independent models. A problem arises, however, from the heterogeneity of the available databases, caused by the use of different lenses, exposure times, external lighting, shadows, etc. We describe an approach that leads to improved convergence of AAM fitting at close range when training a model on two different databases. In addition, this approach offers a substantial improvement when fitting images from unseen databases.
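
The abstract does not state which colour space transforms were evaluated, only that transforming images from heterogeneous databases helps AAM fitting. The following is a minimal sketch of that general idea, assuming OpenCV's standard `cv2.cvtColor` conversions and using CIELAB purely as an illustrative choice; the function names and the selected colour space are hypothetical, not the authors' method.

```python
import cv2
import numpy as np

def to_common_colour_space(bgr_image: np.ndarray) -> np.ndarray:
    """Map a BGR image (OpenCV's default layout) into CIELAB and rescale to [0, 1].

    CIELAB is only an example target space; the idea is that all training
    databases are converted to the same representation before building the
    appearance model, reducing photometric differences between databases.
    """
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    return lab.astype(np.float32) / 255.0

def build_appearance_samples(image_paths):
    """Load images from (possibly different) databases into one colour space."""
    samples = []
    for path in image_paths:
        img = cv2.imread(path)   # returns uint8 BGR, or None on failure
        if img is None:
            continue             # skip unreadable files
        samples.append(to_common_colour_space(img))
    return samples
```

In this sketch the converted samples would then feed an ordinary AAM training pipeline; the point of the preprocessing is that appearance statistics learned from one database remain comparable to images drawn from another.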