Modeling and evaluating empathy in embodied companion agents

  • Authors:
  • Scott W. McQuiggan; James C. Lester

  • Affiliations:
  • Department of Computer Science, North Carolina State University, 890 Oval Drive, Raleigh, NC 27695, USA (both authors)

  • Venue:
  • International Journal of Human-Computer Studies
  • Year:
  • 2007

Abstract

Affective reasoning plays an increasingly important role in cognitive accounts of social interaction. Humans continuously assess one another's situational context, modify their own affective state accordingly, and then respond to these assessments by expressing empathetic behaviors. Synthetic agents serving as companions should respond similarly. However, empathetic reasoning is riddled with complexities stemming from the myriad factors that bear on situational assessment. A key challenge posed by affective reasoning in synthetic agents is devising empirically informed models of empathy that respond accurately in social situations. This paper presents Care, a data-driven affective architecture and methodology for learning models of empathy by observing human-human social interactions. First, in Care training sessions, one trainer directs synthetic agents to perform a sequence of tasks while another trainer manipulates companion agents' affective states to produce empathetic behaviors (spoken language, gesture, and posture). Care tracks situational data, including locational, intentional, and temporal information, to induce a model of empathy. At runtime, Care uses the model of empathy to drive situation-appropriate empathetic behaviors. Care has been used in a virtual environment testbed. Two complementary studies investigating the predictive accuracy and perceived accuracy of Care-induced models of empathy suggest that the Care paradigm can provide the basis for effective empathetic behavior control in embodied companion agents.
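
The abstract describes Care's pipeline only at a high level: situational observations gathered during training sessions are used to induce a model that, at runtime, selects situation-appropriate empathetic behaviors. The sketch below illustrates one plausible realization of that idea; the feature names, behavior labels, and the choice of a decision-tree learner are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: the abstract does not specify Care's learning
# algorithm, so a decision tree stands in as one plausible way to induce
# an empathy model from observed situational attributes. All feature and
# label names below are hypothetical.
from sklearn.feature_extraction import DictVectorizer
from sklearn.tree import DecisionTreeClassifier

# Each observation pairs situational data (locational, intentional,
# temporal) with the empathetic behavior the human trainer exhibited.
observations = [
    ({"location": "infirmary", "intention": "heal_patient", "elapsed_s": 40}, "comforting_gesture"),
    ({"location": "lab", "intention": "run_test", "elapsed_s": 12}, "encouraging_utterance"),
    ({"location": "infirmary", "intention": "heal_patient", "elapsed_s": 95}, "concerned_posture"),
    ({"location": "corridor", "intention": "explore", "elapsed_s": 5}, "neutral"),
]

features, behaviors = zip(*observations)
vectorizer = DictVectorizer(sparse=False)   # one-hot encodes the categorical attributes
X = vectorizer.fit_transform(features)

# Induce the empathy model from the training observations.
model = DecisionTreeClassifier().fit(X, behaviors)

# At runtime, the current situation is encoded the same way and the
# induced model proposes a situation-appropriate empathetic behavior.
current_situation = {"location": "infirmary", "intention": "heal_patient", "elapsed_s": 70}
print(model.predict(vectorizer.transform([current_situation]))[0])
```

In this reading, the runtime prediction would be mapped to the agent's expressive channels (spoken language, gesture, posture) by the behavior controller; how that mapping is done is not detailed in the abstract.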