The human face is capable of producing an astonishing variety of expressions, and sometimes the smallest difference between expressions changes the perceived meaning considerably. Producing realistic-looking facial animations that convey this degree of complexity remains a challenging research topic in computer graphics, and one important open question is: when are facial animations good enough? Here we present an integrated framework in which psychophysical experiments are first used to systematically evaluate the perceptual quality of several computer-generated animations with respect to real-world video sequences. The first experiment evaluates several animation techniques, exposing the specific animation parameters that are important for achieving perceptual fidelity. In a second experiment, we then use these benchmarked animation techniques in the context of perceptual research to systematically investigate the spatiotemporal characteristics of expressions. A third and final experiment applies the quality measures developed in the first two experiments to examine the perceptual impact of changing facial features in order to improve the animation techniques. This integrated approach yields important insights into facial expressions for both the perception and computer graphics communities.
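The benchmarking idea described above, comparing how often observers judge each animation technique as "real" against a real-video baseline, can be illustrated with a minimal sketch. All technique names, response data, and the `fidelity` helper below are hypothetical, invented purely for illustration; they are not from the paper.

```python
# Hypothetical sketch of a perceptual-fidelity benchmark: for each stimulus
# class, observers give a binary "looks real" judgment, and a technique is
# considered perceptually adequate when its rate approaches the video baseline.
# The data and technique names here are made up for illustration only.

# responses[technique] = list of 0/1 judgments (1 = "judged real")
responses = {
    "video":       [1, 1, 1, 0, 1, 1, 1, 1],   # real-world video baseline
    "morph_based": [1, 0, 1, 1, 0, 1, 0, 1],   # hypothetical technique A
    "physics":     [0, 0, 1, 0, 1, 0, 0, 1],   # hypothetical technique B
}

def fidelity(technique):
    """Proportion of trials in which stimuli were judged 'real'."""
    r = responses[technique]
    return sum(r) / len(r)

baseline = fidelity("video")
for tech in ("morph_based", "physics"):
    print(f"{tech}: {fidelity(tech):.2f} (video baseline {baseline:.2f})")
```

In practice such judgments would be collected per expression and per observer, and compared statistically rather than by raw proportions, but the sketch captures the core logic: perceptual quality is measured relative to real video, not in absolute terms.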