Studies on gesture expressivity for a virtual agent

  • Authors:
  • Catherine Pelachaud

  • Affiliations:
  • IUT de Montreuil, Université de Paris 8, INRIA Paris-Rocquencourt, Paris

  • Venue:
  • Speech Communication

  • Year:
  • 2009

Abstract

Our aim is to create an affective embodied conversational agent (ECA), that is, an ECA able to display communicative and emotional signals. Nonverbal communication is conveyed through particular facial expressions, gesture shapes, gaze directions, etc. But it also carries a qualitative aspect through behavior expressivity: how a facial expression or a gesture is executed. In this paper we describe some of the work we have conducted on behavior expressivity, in particular on gesture expressivity. We have developed a model of behavior expressivity based on a set of six parameters that modulate the behavior animation. Expressivity may act at different levels of the behavior: on a particular phase of the behavior, on the whole behavior, and on a sequence of behaviors. When applied at these different levels, expressivity may convey different functions.
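To make the idea of a six-parameter modulation concrete, the sketch below shows one possible shape such a model could take. The parameter names and the modulation formulas are illustrative assumptions for this sketch, not the paper's exact definitions: six scalar parameters in [-1, 1] rescale baseline animation values such as a stroke's amplitude and duration.

```python
from dataclasses import dataclass

@dataclass
class Expressivity:
    """Hypothetical set of six expressivity parameters, each in [-1, 1].

    Names and semantics here are assumptions made for illustration.
    """
    overall_activation: float = 0.0  # overall quantity of movement
    spatial_extent: float = 0.0      # amplitude of the gesture
    temporal_extent: float = 0.0     # speed of execution
    fluidity: float = 0.0            # smoothness between phases
    power: float = 0.0               # dynamic strength / acceleration
    repetition: float = 0.0          # tendency to repeat the stroke

def modulate_stroke(amplitude: float, duration: float,
                    e: Expressivity) -> tuple[float, float]:
    """Apply expressivity to one gesture phase (a stroke).

    Illustrative rule: positive spatial_extent widens the gesture,
    positive temporal_extent shortens it (faster execution).
    The 0.5 and 0.3 gains are arbitrary for this sketch.
    """
    new_amplitude = amplitude * (1.0 + 0.5 * e.spatial_extent)
    new_duration = duration * (1.0 - 0.3 * e.temporal_extent)
    return new_amplitude, new_duration

# Usage: an energetic, fast rendition of a baseline stroke.
energetic = Expressivity(spatial_extent=1.0, temporal_extent=1.0)
amp, dur = modulate_stroke(1.0, 1.0, energetic)
```

The same modulation function could be applied at the three levels the abstract mentions: to a single phase, to every phase of one behavior, or uniformly across a sequence of behaviors.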