A two-step approach to multiple facial feature tracking: temporal particle filter and spatial belief propagation

  • Authors:
  • Congyong Su;Yueting Zhuang;Li Huang;Fei Wu

  • Affiliations:
  • College of Computer Science, Zhejiang University, Hangzhou, China (all authors)

  • Venue:
  • FGR '04: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition
  • Year:
  • 2004

Abstract

Tracking multiple facial features simultaneously is challenging when rich expressions appear on a face. We propose a two-step solution. In the first step, several independent CONDENSATION-style particle filters track each facial feature in the temporal domain. Particle filters are very effective for visual tracking problems; however, multiple independent trackers ignore the spatial constraints and natural relationships among facial features. In the second step, we use Bayesian inference, namely belief propagation, to infer each facial feature's contour in the spatial domain, where the relationships among the contours of facial features are learned beforehand from a large facial expression database. Experimental results show that our algorithm robustly tracks multiple facial features simultaneously, even under large inter-frame motions caused by expression changes.
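The abstract describes the first step only at a high level. The sketch below is a minimal, hypothetical illustration of one CONDENSATION-style resample-predict-measure cycle for a single facial feature, using a toy 2-D point state, a random-walk dynamic model, and a Gaussian likelihood; the paper itself tracks feature contours with image-based observation models, so every function name and parameter here is illustrative rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, weights):
    """Multinomial resampling: draw particle indices proportional to weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def predict(particles, motion_std=2.0):
    """Propagate particles with a simple random-walk dynamic model (assumed)."""
    return particles + rng.normal(0.0, motion_std, size=particles.shape)

def weight(particles, observation, obs_std=3.0):
    """Gaussian likelihood of each particle given an observed feature position."""
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / obs_std ** 2) + 1e-12  # floor avoids all-zero weights
    return w / w.sum()

def condensation_step(particles, weights, observation):
    """One select-predict-measure cycle of a CONDENSATION-style tracker."""
    particles = resample(particles, weights)       # select
    particles = predict(particles)                 # predict
    weights = weight(particles, observation)       # measure
    estimate = np.average(particles, axis=0, weights=weights)
    return particles, weights, estimate

# Usage: track a feature whose observed position drifts over a few frames.
particles = rng.normal([100.0, 80.0], 5.0, size=(200, 2))
weights = np.full(200, 1.0 / 200)
for t, obs in enumerate([[101, 81], [104, 83], [109, 86]]):
    particles, weights, est = condensation_step(particles, weights, np.array(obs, float))
    print(f"frame {t}: estimate = {est.round(1)}")
```

In the paper's second step, the per-feature estimates produced by such independent trackers would then be refined by belief propagation over a spatial model of inter-feature contour relationships learned from the facial expression database.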