Robust Face Tracking via Collaboration of Generic and Specific Models

  • Authors:
  • Peng Wang; Qiang Ji

  • Affiliations:
  • Siemens Corporate Research, Princeton, NJ; -

  • Venue:
  • IEEE Transactions on Image Processing
  • Year:
  • 2008

Abstract

Significant appearance changes of objects under different orientations can cause loss of tracking, known as "drifting." In this paper, we present a collaborative tracking framework to robustly track faces under large pose and expression changes and to learn their appearance models online. The collaborative tracking framework probabilistically combines measurements from an offline-trained generic face model with measurements from online-learned specific face appearance models in a dynamic Bayesian network. In this framework, generic face models provide knowledge of the whole face class, while specific face models provide information about the individual faces being tracked. Their combination therefore yields robust measurements for multiview face tracking. We introduce a mixture of probabilistic principal component analysis (MPPCA) model to represent the appearance of a specific face under multiple views, and we present an online EM algorithm to incrementally update the MPPCA model using tracking results. Experimental results demonstrate that the collaborative tracking and online learning methods can handle large pose changes and are robust to distractions from the background.
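
To make the multiview appearance-modelling idea concrete, below is a minimal sketch, not the authors' implementation: it builds a mixture of probabilistic PCA with one component per coarse head pose and scores a candidate patch under the mixture, then naively sums specific- and generic-model log-likelihoods as a stand-in for the paper's probabilistic combination in a dynamic Bayesian network. Names such as pose_patches and generic_log_likelihood are illustrative assumptions, scikit-learn's probabilistic PCA likelihood serves as each mixture component, and the online EM update is omitted.

```python
# Sketch only: MPPCA appearance model with one PPCA component per coarse
# head pose, plus a naive combination with a generic-model score.
import numpy as np
from sklearn.decomposition import PCA

class MPPCAAppearanceModel:
    def __init__(self, n_components=5):
        self.n_components = n_components   # latent dimensionality per view
        self.ppcas = {}                    # pose label -> fitted PCA model
        self.weights = {}                  # pose label -> mixing weight

    def fit(self, pose_patches):
        """pose_patches: dict mapping a pose label to an (N, D) array of
        vectorized face patches observed under that pose (assumed input)."""
        total = sum(len(x) for x in pose_patches.values())
        for pose, patches in pose_patches.items():
            ppca = PCA(n_components=self.n_components)
            ppca.fit(patches)              # per-view probabilistic PCA
            self.ppcas[pose] = ppca
            self.weights[pose] = len(patches) / total

    def log_likelihood(self, patch):
        """Mixture log-likelihood of one vectorized patch of shape (D,)."""
        x = patch.reshape(1, -1)
        # log sum_k w_k * p_k(x), evaluated with log-sum-exp for stability
        terms = [np.log(w) + ppca.score_samples(x)[0]
                 for ppca, w in zip(self.ppcas.values(), self.weights.values())]
        m = max(terms)
        return m + np.log(sum(np.exp(t - m) for t in terms))

def combined_score(patch, specific_model, generic_log_likelihood):
    """Naive stand-in for the paper's probabilistic combination: sum the
    specific-model and generic-model log-likelihoods of the candidate."""
    return specific_model.log_likelihood(patch) + generic_log_likelihood(patch)
```

In a tracker, such a score would be evaluated for each candidate face location, and the patches assigned to the winning location would then be folded back into the specific model; the paper does this incrementally with an online EM update of the MPPCA rather than the batch refit shown here.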