Technical Section: Expression modeling – a boundary element approach

  • Authors:
K. C. Hui; H. C. Leung

  • Affiliations:
Department of Automation and Computer-Aided Engineering, CAD Laboratory, The Chinese University of Hong Kong, Shatin, Hong Kong (both authors)

  • Venue:
  • Computers and Graphics
  • Year:
  • 2006

Abstract

Popular techniques for modeling facial expression usually rely on shape blending of a series of pre-defined facial models, on feature parameters, or on an anatomy-based facial model. These approaches require extensive user interaction to construct the pre-defined facial models, the deformation functions, or the anatomy-based facial model. Moreover, existing anatomy-based facial modeling techniques are targeted at human facial models and may not be directly applicable to non-human-like character models. This paper presents an intuitive technique for designing facial expressions using a physics-based deformation approach. The technique requires neither deformation functions associated with facial feature parameters nor a detailed anatomical model of the head. Different facial expressions are obtained by adjusting the contraction or relaxation of a set of facial muscles. Facial muscles and skin are assumed to be linearly elastic, and the boundary element method (BEM) is adopted for evaluating deformation of the facial skin. This avoids the volumetric elements required by the finite element method (FEM) and the setup of complex mass-spring models. Given a polygon mesh of a facial model, a closed volume is obtained by offsetting the polygon mesh according to a user-defined depth map. Each facial muscle is approximated by a series of muscle polygons on the mesh surface, and deformation of the facial mesh is attained by stretching or compressing these muscle polygons. By pre-computing the inverse of the stiffness matrix, interactive editing of facial expressions can be achieved.
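The interactive-editing claim at the end of the abstract rests on a standard linear-elasticity observation: a BEM discretization yields a linear system K u = f relating boundary tractions f (from muscle contraction or relaxation) to skin displacements u, and since K depends only on geometry and material constants, its inverse can be precomputed once so that each expression edit reduces to a matrix-vector product. The sketch below illustrates that structure only; the stand-in matrix, the DOF layout, and the `deform` helper are hypothetical, not the paper's implementation (a real K would come from integrating the elastostatic fundamental solution over the boundary elements).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 30  # number of boundary degrees of freedom (toy size)

# Stand-in for a BEM stiffness matrix; in the paper this would be assembled
# from the offset facial mesh, not generated randomly.
K = np.eye(n) * 2.0 + 0.1 * rng.standard_normal((n, n))

K_inv = np.linalg.inv(K)  # one-time offline precomputation

def deform(muscle_activations, muscle_dofs):
    """Map muscle contraction levels to skin displacements at interactive rates.

    muscle_activations: contraction (+) / relaxation (-) per muscle polygon
    muscle_dofs: one index array per muscle polygon (hypothetical DOF layout)
    """
    f = np.zeros(n)
    for a, dofs in zip(muscle_activations, muscle_dofs):
        f[dofs] += a          # spread each muscle's traction over its DOFs
    return K_inv @ f          # per-edit cost is O(n^2): no re-factorization

# Example: two hypothetical muscles acting on disjoint DOF groups.
u = deform([1.0, -0.5], [np.arange(0, 5), np.arange(10, 15)])
print(u.shape)  # (30,)
```

Because the system is linear, doubling every activation doubles the displacement field, which is what makes slider-style interactive editing cheap once `K_inv` is in hand.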