Interactive generation of human animation with deformable motion models

  • Authors:
  • Jianyuan Min; Yen-Lin Chen; Jinxiang Chai

  • Affiliations:
  • Texas A&M University, College Station, TX (all authors)

  • Venue:
  • ACM Transactions on Graphics (TOG)
  • Year:
  • 2009

Abstract

This article presents a new motion model, the deformable motion model, for human motion modeling and synthesis. Our key idea is to apply statistical analysis techniques to a set of precaptured human motion data and construct a low-dimensional deformable motion model of the form x = M(α, γ), where the deformable parameters α and γ control the motion's geometric and timing variations, respectively. To generate a desired animation, we continuously adjust the deformable parameters' values to match various forms of user-specified constraints. Mathematically, we formulate the constraint-based motion synthesis problem in a Maximum A Posteriori (MAP) framework, estimating the most likely deformable parameters from the user's input. We demonstrate the power and flexibility of our approach by exploring two interactive and easy-to-use interfaces for human motion generation: direct manipulation interfaces and sketching interfaces.
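The abstract's pipeline can be sketched in miniature. The toy below is not the authors' code: it assumes a purely linear geometric model (PCA over training motions, with α weighting the modes and the timing parameters γ omitted), a Gaussian prior on α, and a handful of pinned motion coordinates standing in for user constraints. Under those assumptions, the MAP estimate of α reduces to a ridge-regression solve. All names (`synthesize`, `alpha_map`, the toy data) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 50 example motions, each a flattened trajectory of 20 values.
X = rng.normal(size=(50, 20))
mean = X.mean(axis=0)

# PCA: keep the top-3 geometric deformation modes; alpha weights these modes.
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
B = Vt[:3]                       # geometric basis (3 modes x 20 coords)
sigma2 = (S[:3] ** 2) / len(X)   # per-mode variance -> Gaussian prior on alpha

def synthesize(alpha):
    """Toy deformable model x = M(alpha); timing variation (gamma) omitted."""
    return mean + alpha @ B

# "User constraints": pin three coordinates of the motion to target values.
idx = np.array([0, 7, 13])
target = synthesize(np.array([1.0, -0.5, 0.2]))[idx]

# MAP estimate: minimize ||x[idx] - target||^2 / s_n + sum_k alpha_k^2 / sigma2_k.
# Linear model + Gaussian prior => closed-form ridge-regression solution.
s_n = 1e-2                                 # assumed constraint noise variance
A = B[:, idx].T                            # Jacobian of constraints wrt alpha
prior = np.diag(s_n / sigma2)
alpha_map = np.linalg.solve(A.T @ A + prior, A.T @ (target - mean[idx]))

x = synthesize(alpha_map)                  # motion matching the constraints
```

In the paper's actual formulation the model also carries timing parameters γ and the constraints come from interactive direct-manipulation or sketching input, so the optimization is solved continuously as the user edits rather than once in closed form.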