A similarity-based neural network for facial expression analysis

  • Authors:
  • Kenji Suzuki; Hiroshi Yamada; Shuji Hashimoto

  • Affiliations:
  • Department of Intelligent Interaction Technologies, University of Tsukuba, Japan; Department of Psychology, Nihon University, Japan; Department of Applied Physics, Waseda University, Japan

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2007

Abstract

In this paper, we introduce a novel model for measuring human subjective evaluation using Relevance Learning based on a similarity-based multilayer perceptron. This work aims to achieve multidimensional perceptual scaling that associates the physical features of a face with its semantic vector in a low-dimensional space. Unlike a conventional multilayer perceptron, which learns from pairs of an input feature vector and the desired output, the proposed network obtains a nonlinear mapping between input feature vectors and outputs from a pair of objects and their desired relevance (distance). We conducted a facial expression analysis with both a psychological model of line-drawn facial expression images and a set of real images. Regarding the construction of the semantic space, the proposed approach not only performs well compared with the conventional statistical method but is also able to project new data not used during the training phase. We present experimental results and discuss the obtained mapping function.
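
For illustration only, below is a minimal sketch of the pairwise training idea described in the abstract. It is written in PyTorch, which the paper does not use, and the architecture, dimensions, and synthetic data are assumptions rather than the authors' implementation: a small multilayer perceptron maps each feature vector to a low-dimensional semantic vector, and the network is fitted so that the distance between the embeddings of a pair of inputs matches the desired relevance.

# Hypothetical sketch, not the authors' code: a similarity-based MLP trained on
# pairs of feature vectors and a target relevance (distance).
import torch
import torch.nn as nn

class SemanticMapper(nn.Module):
    """Maps a facial feature vector to a low-dimensional semantic vector."""
    def __init__(self, in_dim=40, hidden_dim=32, out_dim=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)

def train_on_pairs(model, xa, xb, target_dist, epochs=200, lr=1e-2):
    """Fit the mapping so that ||f(xa) - f(xb)|| matches the desired relevance."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        d = torch.norm(model(xa) - model(xb), dim=1)   # predicted distance per pair
        loss = nn.functional.mse_loss(d, target_dist)  # match desired relevance
        loss.backward()
        opt.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    n_pairs, in_dim = 256, 40
    xa, xb = torch.randn(n_pairs, in_dim), torch.randn(n_pairs, in_dim)
    target = torch.rand(n_pairs)          # stand-in for human-rated dissimilarity
    model = SemanticMapper(in_dim)
    print("final loss:", train_on_pairs(model, xa, xb, target))
    # New (unseen) faces can be projected directly into the semantic space,
    # which is the property the abstract contrasts with conventional scaling methods.
    print(model(torch.randn(3, in_dim)).detach())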