Animated Talking Head with Personalized 3D Head Model

  • Authors:
  • Jörn Ostermann; Lawrence S. Chen; Thomas S. Huang

  • Affiliations:
  • AT&T Labs—Research, Room 3-231, 100 Schultz Dr., Red Bank, NJ 07701; Beckman Institute CSL, University of Illinois, Urbana, IL 61801; Beckman Institute CSL, University of Illinois, Urbana, IL 61801

  • Venue:
  • Journal of VLSI Signal Processing Systems - special issue on multimedia signal processing
  • Year:
  • 1998


Abstract

A natural human-computer interface requires the integration of realistic audio and visual information for perception and display. An example of such an interface is an animated talking head displayed on the computer screen in the form of a human-like computer agent. This system converts text to acoustic speech with synchronized animation of mouth movements. The talking head is based on a generic 3D human head model, but to improve realism, natural-looking personalized models are necessary. In this paper, we report a semi-automatic method for adapting a generic head model to 3D range data of a human head obtained from a 3D laser range scanner. This personalized model is incorporated into the talking head system. With texture mapping, the personalized model offers a more natural and realistic look than the generic model. The model created with the proposed method compares favorably to generic models.
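The core idea of the abstract, fitting a generic head mesh to scanned range data, can be illustrated with a toy sketch. The following is not the paper's algorithm (which is semi-automatic and mesh-aware); it is a minimal, hypothetical nearest-neighbor projection that moves each generic-model vertex to the closest point of the 3D scan, assuming both are given as NumPy arrays of 3D coordinates.

```python
import numpy as np

def adapt_generic_model(generic_vertices, scan_points):
    """Toy adaptation sketch (not the paper's method): snap each vertex
    of a generic head model to its nearest point in the range scan."""
    adapted = np.empty_like(generic_vertices)
    for i, v in enumerate(generic_vertices):
        # Euclidean distance from this generic vertex to every scan point
        dists = np.linalg.norm(scan_points - v, axis=1)
        # Replace the vertex with the closest scanned surface point
        adapted[i] = scan_points[np.argmin(dists)]
    return adapted

# Example: a crude stand-in "head" (four vertices) adapted to a scan of
# the same shape scaled up by 10%, as if the real head were larger.
generic = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])
scan = 1.1 * generic
print(adapt_generic_model(generic, scan))
```

A real adaptation would also preserve mesh topology and smoothness rather than projecting vertices independently; this sketch only shows the basic correspondence step.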