Distributing expressional faces in 2-D emotional space

  • Authors:
  • Yangzhou Du;Wenyuan Bi;Tao Wang;Yimin Zhang;Haizhou Ai

  • Affiliations:
  • Intel China Research Center, Beijing, P. R. China;Tsinghua University, Beijing, P. R. China;Intel China Research Center, Beijing, P. R. China;Intel China Research Center, Beijing, P. R. China;Tsinghua University, Beijing, P. R. China

  • Venue:
  • Proceedings of the 6th ACM international conference on Image and video retrieval
  • Year:
  • 2007

Abstract

Facial expressions are often classified into one of several basic emotion categories. This categorical approach is ill-suited to faces with blended emotions and makes it hard to measure the intensity of an emotion. In this paper, facial expressions are evaluated with the dimensional approach to affect, originally introduced in psycho-physiological studies. An expressional face can be represented as a point in a two-dimensional (2-D) emotional space characterized by arousal and valence factors. To link low-level face features with these emotional factors, we propose a simple method that builds an emotional mapping through coarse labeling of the Cohn-Kanade database and a linear fitting on the labeled data. Our preliminary experimental results show that the proposed emotional mapping can be used to visualize the distribution of affective content in a large face set and, further, to retrieve expressional face images or relevant video shots by specifying a region in the 2-D emotional space.
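
The linear fitting step described in the abstract can be sketched as an ordinary least-squares regression from low-level face features to coarse valence/arousal labels, followed by region-based retrieval in the 2-D space. The sketch below is illustrative only: the feature dimensionality, function names, and synthetic data are assumptions, not the authors' implementation.

```python
import numpy as np

def fit_emotional_mapping(features, va_labels):
    """Fit a linear map from face features to (valence, arousal) by least squares.

    features:  (n_samples, n_features) array of low-level face features (assumed)
    va_labels: (n_samples, 2) array of coarse valence/arousal labels in [-1, 1]
    Returns W of shape (n_features + 1, 2); a bias column is appended.
    """
    X = np.hstack([features, np.ones((features.shape[0], 1))])  # add bias term
    W, *_ = np.linalg.lstsq(X, va_labels, rcond=None)
    return W

def project(features, W):
    """Map face features to points in the 2-D valence-arousal space."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])
    return X @ W

def retrieve_in_region(points, v_range, a_range):
    """Return indices of faces whose projections fall inside a query rectangle."""
    v, a = points[:, 0], points[:, 1]
    mask = (v >= v_range[0]) & (v <= v_range[1]) & (a >= a_range[0]) & (a <= a_range[1])
    return np.nonzero(mask)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(200, 12))          # placeholder face features, not real data
    labels = rng.uniform(-1, 1, size=(200, 2))  # placeholder coarse valence/arousal labels
    W = fit_emotional_mapping(feats, labels)
    pts = project(feats, W)
    hits = retrieve_in_region(pts, v_range=(0.5, 1.0), a_range=(0.5, 1.0))
    print(f"{len(hits)} faces retrieved from the high-valence, high-arousal region")
```

In this reading, retrieval amounts to selecting all faces whose projected points fall inside a user-specified rectangle of the valence-arousal plane; the paper's actual feature extraction and labeling scheme are not reproduced here.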