Facial expressional image synthesis controlled by emotional parameters

  • Authors:
  • Chuan Zhou; Xueyin Lin

  • Affiliations:
  • Department of Computer Science and Technology, Tsinghua University, Key Laboratory of Pervasive Computing, Ministry of Education, Beijing 100084, PR China (both authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2005

Abstract

Facial expressional image synthesis is an important technique in human-computer interaction, but it is also a difficult task, especially when varied, realistic expressive face images must be produced under a flexible control mechanism. In this paper, a novel parameter-driven method for synthesizing realistic, comprehensive expressional images is proposed. A kernel-based bi-factor factorization model is adopted to decompose two influence factors, identity and expression, together with their interaction, from a small training database. The facial expressional images in the training database can thus be represented by corresponding identity and expression vectors and their interaction matrix, and a comprehensive expression image can be manipulated flexibly as a linear combination of these basis expression vectors. To enable the trained model to produce realistic expressional images of any person outside the training set, the expression ratio image (ERI) and a relative shape description are incorporated into the model to enhance its expressive power. Experimental results show that a realistic facial expressional image can be synthesized from only one picture of a person who differs considerably from those in the training database, and can be controlled efficiently and effectively by a parameter vector.
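As a rough illustration (not the authors' implementation), the core idea of a bi-factor model can be sketched with toy data: each image is generated from an identity vector and an expression vector coupled by an interaction tensor, and a "comprehensive" expression is obtained by linearly combining basis expression vectors with an emotional parameter vector. All sizes, variable names, and the random factors below are illustrative assumptions; the paper's model is kernel-based and fitted to real face images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: P identities x E basis expressions, each image flattened to D
# pixels. Real data would be face images; these are random stand-ins.
P, E, D = 4, 3, 16
I, J = 2, 2                      # latent dims for identity / expression

A = rng.random((P, I))           # identity vectors (one per person)
B = rng.random((E, J))           # basis expression vectors
W = rng.random((I, J, D))        # per-pixel identity-expression interaction

def synthesize(a, b):
    """Render an image from identity vector a and expression vector b
    via the bilinear form y_d = sum_ij a_i * W[i, j, d] * b_j."""
    return np.einsum('i,ijd,j->d', a, W, b)

# Emotional parameter vector: e.g. 70% of expression 0 + 30% of expression 1.
c = np.array([0.7, 0.3, 0.0])
b_mix = c @ B                    # blended expression vector
img = synthesize(A[0], b_mix)    # synthesized comprehensive expression image

# Because the model is linear in the expression factor, blending expression
# vectors is equivalent to blending the corresponding basis-expression images:
img_check = sum(c[e] * synthesize(A[0], B[e]) for e in range(E))
assert np.allclose(img, img_check)
```

This linearity is what makes the parameter vector `c` a direct, flexible control: moving `c` continuously morphs the output between basis expressions for a fixed identity.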