Virtual presenters have a wide range of possible applications, such as teachers, news presenters, and guides in virtual environments, easing interaction with computers. The animation of such virtual characters is usually controlled by an animation script that describes every movement to be performed, and writing a convincing script of this kind is a demanding and cumbersome task. To ease the animation process, we propose the additional use of a behavior model learned from a real presenter. The article presents the implementation of a 3D virtual news presenter that implicitly follows such a behavior model together with a script describing the text to be uttered. The behavior model consists of a set of behavioral rules that capture common non-verbal facial movement patterns displayed by a real presenter, obtained by analyzing her TV appearances.
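As a minimal sketch of the idea described above, the following Python fragment shows how a set of behavioral rules could augment an utterance script with non-verbal facial actions. All rule triggers, action names, and the `annotate` function are illustrative assumptions, not the article's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class FacialAction:
    name: str        # e.g. "eyebrow_raise" (hypothetical action label)
    word_index: int  # index of the word the action is aligned with

# Hypothetical behavioral rules: each maps a textual trigger to a
# facial action, mimicking patterns one might learn from a presenter.
RULES = [
    (lambda i, w: i == 0, "eyebrow_raise"),                 # utterance onset
    (lambda i, w: w.isupper() and len(w) > 1, "head_nod"),  # emphasized word
    (lambda i, w: w.endswith("?"), "head_tilt"),            # question ending
]

def annotate(utterance: str) -> list:
    """Apply every behavioral rule to every word of the utterance,
    producing a list of facial actions aligned with word positions."""
    words = utterance.split()
    actions = []
    for i, w in enumerate(words):
        for trigger, action in RULES:
            if trigger(i, w):
                actions.append(FacialAction(action, i))
    return actions

script = annotate("Good evening, TODAY we report live. Any questions?")
```

In a full system, the resulting action list would be merged with lip-sync timing for the uttered text before being rendered on the 3D character.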