Facial motion cloning with radial basis functions in MPEG-4 FBA
Graphical Models
We propose a method for automatically copying facial motion from one 3D face model to another while preserving the compliance of the motion with the MPEG-4 Face and Body Animation (FBA) standard. Despite the enormous progress in the field of facial animation, producing a new animatable face from scratch is still a tremendous task for an artist. Although many methods exist to animate a face automatically based on procedural techniques, these methods still need to be initialized by defining facial regions or similar, and they lack flexibility because the artist can obtain only the facial motion that a particular algorithm offers. Therefore a very common approach is interpolation between key facial expressions, usually called morph targets, containing either speech elements (visemes) or emotional expressions. Following the same approach, the MPEG-4 Facial Animation specification offers a method for interpolating facial motion from key positions, called Facial Animation Tables, which are essentially morph targets corresponding to all possible motions specified in MPEG-4. The problem with this approach is that the artist needs to create a new set of morph targets for each new face model. In the case of MPEG-4 there are 86 morph targets, which is a lot of work to create manually. Our method solves this problem by cloning the morph targets, i.e. by automatically copying the motion of vertices, as well as geometry transforms, from the source face to the target face while maintaining the regional correspondences and the correct scale of motion. It requires the user only to identify a subset of the MPEG-4 Feature Points in the source and target faces. The scale of the movement is normalized with respect to MPEG-4 normalization units (FAPUs), meaning that the MPEG-4 FBA compliance of the copied motion is preserved. Our method is therefore suitable not only for cloning free facial expressions, but also MPEG-4-compatible facial motion, in particular the Facial Animation Tables.
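The two ingredients the abstract names can be sketched in a few lines: a radial basis function mapping fitted on the corresponding MPEG-4 Feature Points (to carry source-face vertex displacements over to the target-face geometry), and a rescaling of displacement magnitudes by the ratio of target to source FAPUs. The sketch below is illustrative only: it uses a simple phi(r) = r kernel with no polynomial term, whereas the paper's actual kernel, regularization, and per-region handling may differ.

```python
import numpy as np

def rbf_fit(src_pts, dst_pts, eps=1e-9):
    """Fit RBF weights W so that the mapping f(x) = sum_j phi(|x - s_j|) * W_j
    interpolates the feature-point correspondences f(src_pts[i]) ~= dst_pts[i].
    Kernel phi(r) = r is an assumption, not the paper's exact choice."""
    r = np.linalg.norm(src_pts[:, None, :] - src_pts[None, :, :], axis=-1)
    A = r + eps * np.eye(len(src_pts))  # tiny diagonal term keeps A invertible
    return np.linalg.solve(A, dst_pts)

def rbf_apply(weights, src_pts, query_pts):
    """Evaluate the fitted mapping at arbitrary source-face vertices."""
    r = np.linalg.norm(query_pts[:, None, :] - src_pts[None, :, :], axis=-1)
    return r @ weights

def rescale_motion(displacements, src_fapu, dst_fapu):
    """Normalize motion amplitude: a displacement expressed in source FAPUs
    is re-expressed in target FAPUs, preserving MPEG-4 FBA compliance."""
    return displacements * (dst_fapu / src_fapu)
```

In this reading, `rbf_apply` warps source-face vertex positions (or copies per-vertex morph-target displacements) into the target face's coordinate frame, and `rescale_motion` fixes the amplitude so that, e.g., an eyebrow raise of one ENS unit on the source remains one ENS unit on the target.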
We believe that Facial Motion Cloning offers dramatic time savings to artists producing morph targets for facial animation or MPEG-4 Facial Animation Tables.