Similarity Features for Facial Event Analysis

  • Authors:
  • Peng Yang, Qingshan Liu, Dimitris Metaxas

  • Affiliations:
  • Peng Yang: Rutgers University, Piscataway, NJ 08854, USA; Qingshan Liu: Rutgers University, Piscataway, NJ 08854, USA, and National Laboratory of Pattern Recognition, Chinese Academy of Sciences, Beijing 100080, China; Dimitris Metaxas: Rutgers University, Piscataway, NJ 08854, USA

  • Venue:
  • ECCV '08: Proceedings of the 10th European Conference on Computer Vision, Part I
  • Year:
  • 2008

Abstract

Each facial event gives rise to complex variation in facial appearance. In this paper, we propose similarity features to describe facial appearance for video-based facial event analysis. Inspired by kernel features, we compare each sample with a reference set using a similarity function and take the log-weighted sum of the similarities as its similarity feature. Because the apex images of facial events are distinctive, we use their cluster-centers as the references. To capture temporal dynamics, we apply the K-means algorithm to divide the similarity features into several clusters in the temporal domain, and each cluster is modeled by a Gaussian distribution. Based on these Gaussian models, we further map the similarity features into dynamic binary patterns, which embed the time-warping operation implicitly and thus handle differences in time resolution. The Haar-like descriptor is used to extract the visual features of facial appearance, and AdaBoost is applied to learn the final classifiers. Extensive experiments carried out on the Cohn-Kanade database show the promising performance of the proposed method.
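A minimal sketch of how such a similarity feature might be computed from the abstract's description, assuming an RBF (Gaussian) similarity function, uniform reference weights, and a "log of the weighted sum" reading of the log-weighted summarization; the function name `similarity_feature`, the kernel choice, and the toy reference centers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def similarity_feature(sample, references, weights=None, sigma=1.0):
    """Compare one appearance vector against a reference set.

    Assumed reading of the abstract: measure similarity of `sample`
    to each reference (e.g. cluster-centers of apex frames) with an
    RBF kernel, then take the log of a weighted sum of those
    similarities as the sample's similarity feature.
    """
    refs = np.asarray(references)                     # shape (n_refs, d)
    if weights is None:
        weights = np.full(len(refs), 1.0 / len(refs))  # uniform weights (assumption)
    dists = np.linalg.norm(refs - sample, axis=1)      # distance to each reference
    sims = np.exp(-dists**2 / (2.0 * sigma**2))        # RBF similarity (assumption)
    return np.log(np.dot(weights, sims) + 1e-12)       # log-weighted sum of similarities

# Toy usage: three hypothetical apex cluster-centers in a 4-D appearance space.
centers = np.random.rand(3, 4)
frame = np.random.rand(4)
print(similarity_feature(frame, centers))
```

In the paper's pipeline, the per-frame appearance vectors would come from Haar-like descriptors, and the resulting per-frame similarity features would then be clustered over time and quantized into the dynamic binary patterns described above; the sketch covers only the similarity-feature step.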