A physically-based motion retargeting filter

  • Authors: Seyoon Tak; Hyeong-Seok Ko
  • Affiliations: Samsung Advanced Institute of Technology, Yongin-si, Korea; Seoul National University, Seoul, Korea
  • Venue: ACM Transactions on Graphics (TOG)
  • Year: 2005


Abstract

This article presents a novel constraint-based motion editing technique. On the basis of animator-specified kinematic and dynamic constraints, the method converts a given captured or animated motion into a physically plausible motion. In contrast to previous methods based on spacetime optimization, we cast the motion editing problem as a constrained state estimation problem within a per-frame Kalman filter framework. The method works as a filter that sequentially scans the input motion to produce a stream of output motion frames at a stable interactive rate. Animators can tune several filter parameters to adapt to different motions, turn individual constraints on or off according to their contribution to the final result, or provide a rough sketch (kinematic hint) as an effective way of producing the desired motion. Experiments show that the technique processes the motion of a human model with 54 degrees of freedom at about 150 fps when only kinematic constraints are applied, and at about 10 fps when both kinematic and dynamic constraints are applied. Experiments on various types of motion show that the proposed method produces remarkably realistic animations.
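The paper's full formulation is not reproduced in this excerpt, but the core idea it names — treating motion editing as constrained state estimation with a per-frame Kalman filter — can be illustrated with a minimal sketch. The code below is a generic, hypothetical example (all matrix names, dimensions, and the foot-plant-style constraint are assumptions for illustration): each input frame is treated as a noisy measurement, a standard Kalman update fuses it with the prediction, and the posterior estimate is then projected onto a linear kinematic constraint, weighted by the estimate covariance.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a standard linear Kalman filter.
    x, P: prior state estimate and covariance; z: measurement
    (here, the pose from the input motion frame)."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

def project_onto_constraint(x, P, C, d):
    """Project the estimate onto the linear constraint C x = d,
    weighted by P (a covariance-weighted least-squares projection).
    Covariance is left unchanged in this simplified sketch."""
    L = P @ C.T @ np.linalg.inv(C @ P @ C.T)
    return x - L @ (C @ x - d)

# Toy example: a 2-DOF "pose" with the first DOF pinned to 0
# (loosely analogous to a foot-plant kinematic constraint).
x = np.zeros(2)
P = np.eye(2)
F = np.eye(2)              # trivial dynamics model
H = np.eye(2)              # the full pose is observed
Q = 0.01 * np.eye(2)       # process noise
R = 0.10 * np.eye(2)       # measurement noise
C = np.array([[1.0, 0.0]]) # constrain the first coordinate
d = np.array([0.0])

# Filter two noisy input frames sequentially, enforcing the
# constraint after each update, as in a streaming per-frame pass.
for z in (np.array([0.3, 1.0]), np.array([0.2, 1.1])):
    x, P = kalman_step(x, P, z, F, H, Q, R)
    x = project_onto_constraint(x, P, C, d)

print(x)  # first coordinate is driven exactly to the constraint value 0
```

This sequential structure is what allows filter-style methods to run at interactive rates, in contrast to spacetime optimization, which must solve for all frames of the motion simultaneously.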