Spatial relationship preserving character motion adaptation

  • Authors: Edmond S. L. Ho; Taku Komura; Chiew-Lan Tai
  • Affiliations: The University of Edinburgh; The University of Edinburgh; Hong Kong University of Science and Technology
  • Venue: ACM SIGGRAPH 2010 papers
  • Year: 2010


Abstract

This paper presents a new method for editing and retargeting motions that involve close interactions between body parts of single or multiple articulated characters, such as dancing, wrestling, and sword fighting, or between characters and a restricted environment, such as getting into a car. In such motions, the implicit spatial relationships between body parts/objects are important for capturing the scene semantics. We introduce a simple structure called an interaction mesh to represent such spatial relationships. By minimizing the local deformation of the interaction meshes of animation frames, such relationships are preserved during motion editing while reducing the number of inappropriate interpenetrations. The interaction mesh representation is general and applicable to various kinds of close interactions. It also works well for interactions involving contacts and tangles as well as those without any contacts. The method is computationally efficient, allowing real-time character control. We demonstrate its effectiveness and versatility in synthesizing a wide variety of motions with close interactions.
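Below is a minimal sketch of the interaction-mesh idea summarized in the abstract, not the authors' implementation. It assumes that the joints of all characters are given as 3-D points, that the mesh connectivity comes from a Delaunay tetrahedralization of those points, and that "local deformation" is measured with uniform-weight Laplacian coordinates; all function and variable names are illustrative only.

```python
import numpy as np
from scipy.spatial import Delaunay


def build_interaction_mesh(joint_positions):
    """Connect the joints of all characters into one volumetric mesh
    (here via a Delaunay tetrahedralization) and return, for each
    vertex, the indices of its mesh neighbours."""
    points = np.asarray(joint_positions, dtype=float)  # shape (n, 3)
    tetra = Delaunay(points)
    neighbours = [set() for _ in range(len(points))]
    for simplex in tetra.simplices:
        for i in simplex:
            for j in simplex:
                if i != j:
                    neighbours[i].add(j)
    return [sorted(n) for n in neighbours]


def laplacian_coordinates(points, neighbours):
    """Local-deformation descriptor: offset of each vertex from the
    centroid of its mesh neighbours."""
    points = np.asarray(points, dtype=float)
    lap = np.empty_like(points)
    for i, nbrs in enumerate(neighbours):
        lap[i] = points[i] - points[nbrs].mean(axis=0)
    return lap


def deformation_energy(edited_points, reference_points, neighbours):
    """Sum of squared changes in Laplacian coordinates between the
    reference frame and the edited frame; keeping this small while
    editing a pose preserves the spatial relationships encoded by the
    interaction mesh of the reference frame."""
    ref_lap = laplacian_coordinates(reference_points, neighbours)
    new_lap = laplacian_coordinates(edited_points, neighbours)
    return float(np.sum((new_lap - ref_lap) ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random((20, 3))  # e.g. joints of two interacting characters
    mesh = build_interaction_mesh(reference)
    edited = reference + 0.01 * rng.standard_normal(reference.shape)
    print("deformation energy:", deformation_energy(edited, reference, mesh))
```

In the paper's setting this energy would be minimized over the edited poses, subject to the editing or retargeting constraints, rather than merely evaluated as above; the sketch only shows how an interaction mesh and its local-deformation measure could be represented.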