Generating coordinated multimodal behavior for an embodied agent (speech, gesture, facial expression, ...) is challenging: it requires a high degree of animation control, particularly when reactive behaviors are needed. We propose distinguishing realization planning, where gesture and speech are processed symbolically using the Behavior Markup Language (BML), from presentation, which is controlled by a lower-level animation language (EMBRScript). Reactive behaviors can bypass planning and control presentation directly. In this paper, we show how to define a behavior lexicon, how this lexicon relates to BML, and how to resolve timing using formal constraint solvers. We conclude by demonstrating how to integrate reactive emotional behaviors.
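To make the timing-resolution idea concrete, here is a minimal sketch, assuming a BML-style setup in which sync points (e.g. a gesture's stroke aligned with a word's onset) are expressed as offset constraints and resolved by simple propagation. The sync-point names, offsets, and solver are illustrative assumptions, not the actual EMBR implementation, which uses a formal constraint solver.

```python
def resolve_timing(known, constraints):
    """Resolve sync-point times by constraint propagation.

    known: dict mapping a sync point to an absolute time in seconds
           (e.g. word onsets supplied by the TTS engine).
    constraints: list of (a, b, offset) tuples, each meaning
                 time[a] = time[b] + offset.
    Returns a dict with all derivable sync points fixed.
    """
    times = dict(known)
    changed = True
    while changed:
        changed = False
        for a, b, off in constraints:
            # Propagate in either direction along the constraint.
            if b in times and a not in times:
                times[a] = times[b] + off
                changed = True
            elif a in times and b not in times:
                times[b] = times[a] - off
                changed = True
    return times

# Hypothetical example: the gesture stroke must coincide with the
# onset of the word "great", with the preparation phase starting
# 0.4 s before the stroke.
known = {"speech:great:start": 1.2}
constraints = [
    ("gesture:stroke", "speech:great:start", 0.0),
    ("gesture:start", "gesture:stroke", -0.4),
]
resolved = resolve_timing(known, constraints)
# gesture:stroke resolves to 1.2 s, gesture:start to ~0.8 s
```

Fixed-point propagation like this handles chains of equality/offset constraints; a real realizer additionally needs inequality constraints and conflict handling, which is where a formal constraint solver earns its keep.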