Animating a conversational agent with user expressivity

  • Authors:
  • M. K. Rajagopal; P. Horain; C. Pelachaud

  • Affiliations:
  • Institut Telecom, Telecom SudParis, Évry Cedex, France; Institut Telecom, Telecom SudParis, Évry Cedex, France; CNRS Telecom ParisTech, Paris, France

  • Venue:
  • IVA '11: Proceedings of the 11th International Conference on Intelligent Virtual Agents
  • Year:
  • 2011

Abstract

Our objective is to animate an embodied conversational agent (ECA) with communicative gestures rendered with the expressivity of the real human user it represents. We describe an approach to estimate a subset of the expressivity parameters defined in the literature (namely spatial and temporal extent) from captured motion trajectories. We first validate this estimation against synthesized motion and then show results on real human motion. The estimated expressivity is then sent to the animation engine of an ECA, which becomes a personalized autonomous representative of that user.
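
The abstract does not detail how spatial and temporal extent are computed from the captured trajectories. As a rough illustration only, the Python sketch below shows one plausible way such parameters could be derived from a recorded wrist trajectory: spatial extent from how much space the gesture sweeps, temporal extent from how fast the stroke is executed. The function name estimate_expressivity and the calibration constants ref_amplitude and ref_peak_speed are hypothetical and do not come from the paper; this is an assumption-based sketch, not the authors' estimation method.

    import numpy as np

    def estimate_expressivity(trajectory, dt, ref_amplitude=1.0, ref_peak_speed=1.0):
        """Estimate (spatial_extent, temporal_extent) from a gesture trajectory.

        trajectory: (T, 3) array of wrist positions over time, in metres.
        dt: sampling interval in seconds.
        ref_amplitude, ref_peak_speed: neutral-expressivity references
        (hypothetical calibration constants, not taken from the paper).
        Values come out roughly in [-1, 1] when the references are well chosen.
        """
        trajectory = np.asarray(trajectory, dtype=float)

        # Spatial extent: diagonal of the trajectory's bounding box,
        # compared against a neutral reference amplitude.
        bbox_diagonal = np.linalg.norm(trajectory.max(axis=0) - trajectory.min(axis=0))
        spatial_extent = np.clip(bbox_diagonal / ref_amplitude - 1.0, -1.0, 1.0)

        # Temporal extent: peak speed of the stroke, compared against
        # a neutral reference speed.
        velocities = np.diff(trajectory, axis=0) / dt
        peak_speed = np.linalg.norm(velocities, axis=1).max()
        temporal_extent = np.clip(peak_speed / ref_peak_speed - 1.0, -1.0, 1.0)

        return spatial_extent, temporal_extent

    # Example: a synthetic circular arm sweep sampled at 100 Hz.
    t = np.linspace(0.0, 1.0, 100)
    demo = np.stack([0.4 * np.cos(2 * np.pi * t),
                     0.4 * np.sin(2 * np.pi * t),
                     np.zeros_like(t)], axis=1)
    print(estimate_expressivity(demo, dt=0.01, ref_amplitude=0.8, ref_peak_speed=2.0))

In such a scheme, the resulting pair of values would be passed to the ECA's animation engine as expressivity parameters so that synthesized gestures are scaled in space and time to match the user's style.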