We propose a control mechanism for facial expressions that applies a few carefully chosen parametric modifications to pre-existing expression data streams. The approach works with any facial animation resource expressed in the general MPEG-4 form, whether drawn from a library of preset facial expressions, captured from live performance, or created entirely by hand. The MPEG-4 Facial Animation Parameters (FAPs) represent a facial expression as a set of parameterized muscle actions, given as the intensities of individual muscle movements over time. Our system varies expressions by changing the intensities and scope of sets of MPEG-4 FAPs, creating variations in "expressiveness" across the face model rather than simply scaling, interpolating, or blending facial mesh node positions. The control parameters are adapted from the Effort parameters of Laban Movement Analysis (LMA); we developed a mapping from their values onto sets of FAPs. The FacEMOTE parameters thus perturb a base expression to create a wide range of expressions. Such an approach could allow real-time face animations to adapt underlying speech or facial expression shapes dynamically to the agent's current affect or to user interaction needs.
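The core idea — scaling the intensities of groups of FAPs by gains derived from LMA Effort values — can be sketched as follows. This is a minimal illustration, not the paper's actual mapping: the FAP names, the grouping into upper/lower face, and the linear gain coefficients are all assumptions chosen for the example.

```python
from dataclasses import dataclass
from typing import Dict, List

# Illustrative FAP names; MPEG-4 defines 68 FAPs organized into groups
# (jaw/lips, eyebrows, eyelids, ...). The split below is an assumption.
UPPER_FACE = ["raise_l_i_eyebrow", "raise_r_i_eyebrow", "close_t_l_eyelid"]
LOWER_FACE = ["open_jaw", "stretch_l_cornerlip", "stretch_r_cornerlip"]

@dataclass
class Effort:
    """LMA Effort factors, each normalized to [-1, 1]."""
    space: float = 0.0   # indirect .. direct
    weight: float = 0.0  # light .. strong
    time: float = 0.0    # sustained .. sudden
    flow: float = 0.0    # free .. bound

def effort_to_gains(e: Effort) -> Dict[str, float]:
    """Map Effort values to per-FAP intensity gains.

    The linear coefficients here are hypothetical; the real mapping is
    derived empirically in the paper.
    """
    gains: Dict[str, float] = {}
    for fap in UPPER_FACE:
        gains[fap] = 1.0 + 0.5 * e.weight + 0.25 * e.space
    for fap in LOWER_FACE:
        gains[fap] = 1.0 + 0.5 * e.weight - 0.25 * e.flow
    return gains

def perturb_stream(frames: List[Dict[str, float]],
                   e: Effort) -> List[Dict[str, float]]:
    """Scale each FAP intensity in a frame-by-frame stream by its gain,
    leaving the base expression data untouched."""
    gains = effort_to_gains(e)
    return [{fap: value * gains.get(fap, 1.0) for fap, value in frame.items()}
            for frame in frames]
```

Because the perturbation is a per-FAP scaling of an existing stream, the same base expression (preset, captured, or hand-authored) can be re-rendered with different Effort settings in real time; a neutral Effort (all zeros) leaves the stream unchanged.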