Our aim is to create an affective embodied conversational agent (ECA), that is, an ECA able to display communicative and emotional signals. Nonverbal communication is conveyed through particular facial expressions, gesture shapes, gaze directions, and so on. But it also carries a qualitative aspect through behavior expressivity: how a facial expression or a gesture is executed. In this paper we describe some of the work we have conducted on behavior expressivity, in particular on gesture expressivity. We have developed a model of behavior expressivity based on a set of six parameters that modulate behavior animation. Expressivity may act at different levels of a behavior: on a particular phase of the behavior, on the whole behavior, or on a sequence of behaviors. Applied at these different levels, expressivity may convey different functions.
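The idea of a small set of parameters modulating gesture animation can be sketched as follows. This is a minimal illustration only: the parameter names and the simple scaling rule are assumptions for the sketch, not the paper's actual model or parameter set.

```python
from dataclasses import dataclass

@dataclass
class Expressivity:
    """Six hypothetical expressivity parameters, each in [-1, 1],
    acting as modulations of a neutral gesture animation."""
    spatial_extent: float = 0.0    # amplitude of the movement
    temporal_extent: float = 0.0   # speed of execution
    fluidity: float = 0.0          # smoothness between phases
    power: float = 0.0             # acceleration / tension
    repetition: float = 0.0        # tendency to repeat the stroke
    overall_activation: float = 0.0  # overall quantity of behavior

def modulate_keyframe(position, duration, expr):
    """Apply two of the parameters to one gesture keyframe:
    scale its spatial coordinates and compress its duration.
    Illustrative scaling only; a real model would shape each
    gesture phase (preparation, stroke, retraction) differently."""
    scaled_position = [p * (1.0 + expr.spatial_extent) for p in position]
    scaled_duration = duration / (1.0 + expr.temporal_extent)
    return scaled_position, scaled_duration

# A wider, faster rendition of the same keyframe:
expr = Expressivity(spatial_extent=0.5, temporal_extent=0.25)
pos, dur = modulate_keyframe([0.2, 0.4, 0.1], 1.0, expr)
```

Because the parameters modulate rather than replace the underlying animation, the same scheme can be applied at a single phase, across a whole behavior, or over a sequence of behaviors by choosing which keyframes it operates on.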