A model of gaze for the purpose of emotional expression in virtual embodied agents

  • Authors:
  • Brent J. Lance; Stacy C. Marsella

  • Affiliations:
  • University of Southern California, Marina Del Rey, CA; University of Southern California, Marina Del Rey, CA

  • Venue:
  • Proceedings of the 7th international joint conference on Autonomous agents and multiagent systems - Volume 1
  • Year:
  • 2008

Abstract

Currently, state-of-the-art virtual agents lack the ability to display emotion as seen in actual humans, or even in hand-animated characters. One reason for this emotional inexpressiveness is the lack of an emotionally expressive gaze manner. For virtual agents to express emotion that observers can empathize with, they need to generate gaze - including eye, head, and torso movement - toward arbitrary targets while displaying arbitrary emotional states. Our previous work [18] describes the Gaze Warping Transformation, a method, derived from human movement data, for generating emotionally expressive head and torso movement during gaze shifts. An evaluation showed that applying different transformations to the same gaze shift could modify the affective state that human observers perceived in the transformed gaze shift. In this paper we propose a model of realistic, emotionally expressive gaze that builds upon the Gaze Warping Transformation by improving the transformation implementation and by adding a model of eye movement drawn from the visual neuroscience literature. We describe how to generate a gaze shift toward an arbitrary target while displaying arbitrary emotional behavior. Finally, we propose an evaluation to determine which emotions human observers attribute to the generated gaze shifts. Once this work is completed, virtual agents will have access to a new channel for emotionally expressive behavior.
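To make the idea concrete, a warping transformation of this kind can be thought of as reshaping the timing and displacement of a keyframed movement channel (e.g. head pitch during a gaze shift). The sketch below is an illustrative assumption, not the paper's actual implementation: the function name, the keyframe representation, and the per-segment scale factors are all hypothetical.

```python
def apply_gaze_warp(keyframes, time_scales, amp_scales):
    """Warp one movement channel of a gaze shift.

    keyframes   : [(t, value), ...] with monotonically increasing times,
                  e.g. head pitch in degrees over a neutral gaze shift
    time_scales : per-segment duration multipliers (len(keyframes) - 1)
    amp_scales  : per-segment displacement multipliers (same length)
    """
    warped = [keyframes[0]]
    t, v = keyframes[0]
    for i in range(1, len(keyframes)):
        dt = keyframes[i][0] - keyframes[i - 1][0]  # segment duration
        dv = keyframes[i][1] - keyframes[i - 1][1]  # segment displacement
        t += dt * time_scales[i - 1]                # stretch/compress timing
        v += dv * amp_scales[i - 1]                 # exaggerate/damp movement
        warped.append((t, v))
    return warped

# Hypothetical example: slow down and exaggerate a downward head pitch,
# the kind of change one might extract from "sad" gaze-shift motion data.
neutral = [(0.0, 0.0), (0.5, -10.0), (1.0, -10.0)]
sad = apply_gaze_warp(neutral, time_scales=[2.0, 1.0], amp_scales=[1.5, 1.0])
# sad == [(0.0, 0.0), (1.0, -15.0), (1.5, -15.0)]
```

The point of this framing is that the same neutral gaze shift can be retargeted to different affective displays purely by swapping in different scale profiles, mirroring the paper's finding that different transformations of one gaze shift change the emotion observers perceive.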