Embodied Conversational Agents (ECAs) are an application of virtual characters that is the subject of considerable ongoing research. An essential prerequisite for creating believable ECAs is the ability to describe and visually realize multimodal conversational behaviors. The recently developed Behavior Markup Language (BML) addresses this requirement by providing a means to specify the physical realization of multimodal behaviors in human-readable scripts. In this paper we present an approach to implementing a behavior realizer compatible with BML. The system’s architecture is based on hierarchical controllers that apply preprocessed behaviors to body modalities. The animation database is easily extensible and contains behavior examples built upon existing lexicons and gesture theory. Furthermore, we describe a novel solution to the problem of synchronizing gestures with synthesized speech using neural networks, and we propose improvements to the BML specification.
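For illustration, a BML script of the kind the abstract describes pairs behavior elements with cross-modal synchronization constraints. The element and attribute names below follow the public BML specification; the ids, lexeme values, and utterance text are invented for this sketch:

```xml
<bml id="bml1">
  <!-- Speech to be synthesized; the sync element marks a
       time point inside the utterance. -->
  <speech id="s1">
    <text>This is an <sync id="tm1"/>important point.</text>
  </speech>
  <!-- The gesture stroke is constrained to coincide with
       sync point tm1 of speech element s1. -->
  <gesture id="g1" lexeme="BEAT" stroke="s1:tm1"/>
  <!-- A head nod starting at the same point. -->
  <head id="h1" lexeme="NOD" start="s1:tm1"/>
</bml>
```

A behavior realizer such as the one presented here parses such a script, resolves the synchronization constraints against the synthesized speech timeline, and dispatches the resulting animations to the appropriate body modalities.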