Human emotional behavior, personality, and body language are essential to the believability of a synthetic story character. This paper presents an approach that uses story scripts and action descriptions, in a form similar to the content descriptions of storyboards, to predict specific personality and emotional states. Adopting the Abridged Big Five Circumplex (AB5C) model of personality from psychology as the basis for a computational model, we construct a hierarchical fuzzy rule-based system that lets personality and emotion control the body language of a dynamic story character. The story character consistently performs postures and gestures that match his or her personality type. Story designers can devise a story context through our story interface that predictably produces personality and emotion values, which in turn drive the appropriate movements of the story characters. Our system draws on relevant knowledge from psychology and from research on storytelling, nonverbal communication, and human movement. Our ultimate goal is to facilitate high-level control of synthetic characters.
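To illustrate the kind of mapping such a fuzzy rule-based system performs, the sketch below shows a single hypothetical rule layer that turns a personality trait value into a gesture parameter. This is not the authors' implementation; the trait (extraversion), the rule partitions, and the output centers are all illustrative assumptions, written in the spirit of a hierarchical fuzzy controller for character body language.

```python
# Hypothetical sketch (not the paper's code): one fuzzy rule layer mapping an
# extraversion score in [0, 1] to a gesture-amplitude value in [0, 1].

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def gesture_amplitude(extraversion):
    """Fire three fuzzy rules (low/medium/high extraversion) and combine
    their consequents by weighted-average defuzzification."""
    weights = [
        tri(extraversion, -0.5, 0.0, 0.5),  # IF extraversion IS low
        tri(extraversion, 0.0, 0.5, 1.0),   # IF extraversion IS medium
        tri(extraversion, 0.5, 1.0, 1.5),   # IF extraversion IS high
    ]
    # Consequent centers: subdued, moderate, expansive gestures (illustrative).
    centers = [0.2, 0.5, 0.9]
    total = sum(weights)
    return sum(w * c for w, c in zip(weights, centers)) / total if total else 0.5

# A strongly extraverted character gets an expansive gesture amplitude:
print(round(gesture_amplitude(0.8), 3))  # -> 0.74
```

In a hierarchical arrangement, the output of a layer like this would feed further rule layers (e.g. combining personality with a transient emotional state) before driving the character's postures and gestures.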