Multimodal behavior realization for embodied conversational agents

  • Authors:
  • Aleksandra Čereković; Igor S. Pandžić

  • Affiliation:
  • Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia

  • Venue:
  • Multimedia Tools and Applications
  • Year:
  • 2011


Abstract

Applications featuring intelligent conversational virtual humans, called Embodied Conversational Agents (ECAs), seek to bring human-like abilities to machines and establish natural human-computer interaction. In this paper we discuss the realization of ECA multimodal behaviors, which include speech and nonverbal behaviors. We present RealActor, an open-source, multi-platform animation system for real-time multimodal behavior realization for ECAs. The system employs a novel solution for synchronizing gestures and speech using neural networks, as well as an adaptive face animation model based on the Facial Action Coding System (FACS) to synthesize facial expressions. Our aim is to provide a generic animation system that helps researchers create believable and expressive ECAs.
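The abstract mentions a FACS-based face animation model. As a rough illustration of the general idea behind FACS-driven synthesis (not the paper's actual model), the sketch below maps FACS Action Unit (AU) activations onto blendshape weights; the AU numbers follow the FACS standard, but the blendshape names and gain values are invented for demonstration.

```python
# Illustrative sketch: combining FACS Action Unit (AU) activations into
# blendshape weights for a face rig. The AU-to-blendshape mapping and its
# gains are hypothetical, chosen only to show the blending mechanism.

# Hypothetical mapping from FACS AUs to named blendshapes with gains.
AU_TO_BLENDSHAPES = {
    1: {"brow_inner_up": 1.0},                          # AU1: inner brow raiser
    6: {"cheek_raise": 1.0},                            # AU6: cheek raiser
    12: {"mouth_smile_l": 0.8, "mouth_smile_r": 0.8},   # AU12: lip corner puller
}

def aus_to_blendshape_weights(au_activations):
    """Combine AU activations (each in [0, 1]) into blendshape weights,
    summing contributions and clamping each weight to [0, 1]."""
    weights = {}
    for au, intensity in au_activations.items():
        for shape, gain in AU_TO_BLENDSHAPES.get(au, {}).items():
            weights[shape] = min(1.0, weights.get(shape, 0.0) + gain * intensity)
    return weights

# Example: a smile expression combining AU6 and AU12 at moderate intensity.
smile = aus_to_blendshape_weights({6: 0.5, 12: 0.75})
```

In a real-time system such as RealActor, a mapping of this kind would be evaluated every animation frame, with AU intensities interpolated over time to produce smooth expression changes.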