This article presents a multimodal human-computer interface built around photorealistic virtual humans that can talk, emote, and act adaptively and intelligently in response to the actions of a user in front of a computer screen. The authors implemented this virtual human interface (VHI) system in a high-performance, real-time visual environment. The article describes novel 3D facial modeling and animation techniques used to design virtual faces capable of conveying fine details of metacommunication and supporting verbal content. It also introduces an artificial expression space representation for controlling the emotions and expressions of the animated digital human characters. The article then describes the use of intelligent sensory mechanisms, such as vision, hearing, and touch, and presents special-purpose sensory modules, such as face recognition and expression analysis. Experimental results from the VHI environment and practical application areas, including the first holographic virtual human, are discussed.
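The abstract's "artificial expression space" can be pictured as a low-dimensional space in which basic expressions sit at fixed coordinates, and an arbitrary emotional state is rendered by blending the facial parameters of nearby basis expressions. The following is a minimal illustrative sketch only; the coordinates, blendshape names, and inverse-distance blending rule are assumptions for the example, not the authors' actual parameterization.

```python
import math

# Hypothetical 2D expression space (axes loosely valence/arousal).
# Each basis expression has a position and a set of blendshape weights.
EXPRESSIONS = {
    "joy":     {"pos": (0.6, 0.4),  "blend": {"smile": 1.0, "brow_raise": 0.3}},
    "sadness": {"pos": (-0.7, -0.5),"blend": {"frown": 0.9, "brow_raise": 0.1}},
    "anger":   {"pos": (-0.6, 0.7), "blend": {"frown": 0.6, "brow_lower": 1.0}},
    "neutral": {"pos": (0.0, 0.0),  "blend": {}},
}

def blend_expression(valence, arousal, eps=1e-6):
    """Inverse-distance-weighted blend of blendshape weights at a point
    (valence, arousal) in the expression space."""
    weights, total = {}, 0.0
    for expr in EXPRESSIONS.values():
        d = math.hypot(valence - expr["pos"][0], arousal - expr["pos"][1])
        if d < eps:  # exactly on a basis expression: use it directly
            return dict(expr["blend"])
        w = 1.0 / d  # closer basis expressions contribute more
        total += w
        for shape, value in expr["blend"].items():
            weights[shape] = weights.get(shape, 0.0) + w * value
    return {shape: v / total for shape, v in weights.items()}

pure_joy = blend_expression(0.6, 0.4)   # sits on "joy"
mixed = blend_expression(0.2, 0.1)      # partway between joy and neutral
```

An animation system driven this way can move a character's emotional state continuously through the space, producing smooth transitions between expressions rather than discrete switches.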