A significant goal in multimodal virtual agent research is determining how to vary the expressive qualities of a character so that it is perceived in a desired way. The "Big Five" model of personality offers a potential framework for organizing these expressive variations. In this work, we focus on one parameter of this model, extraversion, and demonstrate how both verbal and non-verbal factors affect its perception. We summarize relevant findings from the psychology literature and, based on these, conducted an experiment with a virtual agent showing how language generation, gesture rate, and a set of movement performance parameters can be varied to increase or decrease perceived extraversion. Each of these factors was found to have a significant effect. These results offer guidance to agent designers on how best to create specific characters.