Designing an expressive avatar of a real person

  • Authors:
  • Sangyoon Lee;Gordon Carlson;Steve Jones;Andrew Johnson;Jason Leigh;Luc Renambot

  • Affiliations:
  • Electronic Visualization Laboratory, University of Illinois at Chicago, Chicago, IL (all authors)

  • Venue:
  • IVA'10 Proceedings of the 10th international conference on Intelligent virtual agents
  • Year:
  • 2010


Abstract

The human ability to express and recognize emotions plays an important role in face-to-face communication, and as technology advances it will be increasingly important for computer-generated avatars to be similarly expressive. In this paper, we present the detailed development process for the Lifelike Responsive Avatar Framework (LRAF) and a prototype application that models a specific individual in order to analyze the effectiveness of expressive avatars. In particular, the goals of our pilot study (n = 1,744) are to determine whether the avatar being developed is capable of conveying emotional states (Ekman's six classic emotions) via facial features, and whether a realistic avatar is an appropriate vehicle for conveying the emotional states accompanying spoken information. The results of this study show that happiness and sadness are correctly identified with a high degree of accuracy, while the other four emotional states show mixed results.