Coding Facial Expressions with Gabor Wavelets. In FG '98: Proceedings of the 3rd International Conference on Face & Gesture Recognition.
Visual Prosody: Facial Movements Accompanying Speech. In FGR '02: Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition.
Emotional facial expression model building. Pattern Recognition Letters.
The reliability and validity of the Chinese version of abbreviated PAD emotion scales. In ACII '05: Proceedings of the First International Conference on Affective Computing and Intelligent Interaction.
An efficient use of MPEG-4 FAP interpolation for facial animation at 70 bits/frame. IEEE Transactions on Circuits and Systems for Video Technology.
Facial expression plays an important role in face-to-face communication, conveying nonverbal information and emotional intent beyond speech. In this paper, an approach to facial expression synthesis for an expressive Chinese talking avatar is proposed, in which a layered parametric framework synthesizes intermediate facial expressions from PAD emotional parameters [5], which describe the human emotional state along three nearly orthogonal dimensions (pleasure, arousal, and dominance). Partial Expression Parameters (PEPs) are proposed to depict facial expression movements in specific face regions; they act as mid-level expression parameters between the low-level Facial Animation Parameters (FAPs) [11] and the high-level PAD emotional parameters. A pseudo facial expression database is established by cloning real human expressions onto the avatar, and the corresponding emotional state of each expression is annotated with a PAD score. An emotion-expression mapping model is trained on this database to map an emotional state (PAD) to a facial expression configuration (PEP). Perceptual evaluation shows that the input PAD values are consistent with human perception of the synthesized expressions, which supports the effectiveness of the approach.
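The layered mapping described above could be sketched as follows. This is a minimal illustration, not the paper's actual model: it assumes a simple least-squares linear map from 3-D PAD values to an 8-dimensional PEP vector, fitted on a synthetic stand-in for the annotated expression database; the dimensions, weights, and `pad_to_pep` helper are all hypothetical.

```python
import numpy as np

# Hypothetical sketch: high-level PAD (pleasure, arousal, dominance) ->
# mid-level Partial Expression Parameters (PEPs). A second, fixed table
# would map PEPs to low-level MPEG-4 FAPs; that stage is omitted here.
rng = np.random.default_rng(0)

# Stand-in for the annotated database: N expressions, each with a
# 3-D PAD label and an 8-D PEP configuration (sizes are assumptions).
N, N_PEP = 100, 8
pad = rng.uniform(-1.0, 1.0, size=(N, 3))        # PAD scores in [-1, 1]
true_w = rng.normal(size=(3, N_PEP))             # "ground-truth" weights
pep = pad @ true_w + 0.01 * rng.normal(size=(N, N_PEP))  # noisy PEP labels

# Train the emotion-expression mapping by ordinary least squares.
w, *_ = np.linalg.lstsq(pad, pep, rcond=None)

def pad_to_pep(p, a, d):
    """Map one PAD emotional state to a PEP expression configuration."""
    return np.array([p, a, d]) @ w

# Example: a pleasant, mildly aroused, somewhat dominant state
# yields one activation value per face region.
pep_config = pad_to_pep(0.8, 0.3, 0.5)
print(pep_config.shape)  # -> (8,)
```

In practice the paper trains this mapping on PAD annotations of cloned human expressions; a linear model is only one plausible choice for that regression.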