Generalizing motion edits with Gaussian processes

  • Authors: Leslie Ikemoto, Okan Arikan, David Forsyth
  • Affiliations: University of California, Berkeley, CA; University of Texas at Austin, Austin, TX; University of Illinois at Urbana-Champaign, Urbana, IL
  • Venue: ACM Transactions on Graphics (TOG)
  • Year: 2009


Abstract

One way that artists create compelling character animations is by manipulating details of a character's motion. This process is expensive and repetitive. We show that we can make such motion editing more efficient by generalizing the edits an animator makes on short sequences of motion to other sequences. Our method predicts frames for the motion using Gaussian process models of kinematics and dynamics. These estimates are combined using probabilistic inference. Our method can be used to propagate edits from examples to an entire sequence for an existing character, and it can also be used to map a motion from a control character to a very different target character. The technique shows good generalization. For example, we show that an estimator, learned from a few seconds of edited example animation using our methods, generalizes well enough to edit minutes of character animation in a high-quality fashion. Learning is interactive: an animator who wants to improve the output can provide small, correcting examples, and the system will produce improved estimates of motion. We make this interactive learning process efficient and natural with a fast, full-body IK system with novel features. Finally, we present data from interviews with professional character animators indicating that generalizing and propagating animator edits can save artists significant time and work.
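To make the core idea concrete, the sketch below shows Gaussian process regression learning a mapping from original motion data to edited motion data, in the spirit of the abstract's "learn from a few edited examples, then generalize" workflow. This is a minimal, hypothetical illustration, not the paper's actual model: it uses a 1-D toy signal in place of full-body pose features, a plain squared-exponential kernel, and invented names (`rbf_kernel`, `gp_predict`); the paper's method additionally models dynamics and fuses predictions with probabilistic inference.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.5, variance=1.0):
    # Squared-exponential kernel between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-4):
    # Standard GP regression posterior mean under a zero-mean prior:
    # mean = K(X*, X) [K(X, X) + noise*I]^{-1} y
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = rbf_kernel(X_test, X_train)
    alpha = np.linalg.solve(K, y_train)
    return K_star @ alpha

# Toy stand-in for edit generalization: pretend the animator's edit adds
# a smooth offset to one joint angle, and learn the original -> edited map
# from a handful of (original, edited) frame pairs.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(20, 1))                  # original angle
y_train = X_train[:, 0] + 0.3 * np.sin(3 * X_train[:, 0])   # edited angle
X_test = np.linspace(-1, 1, 5)[:, None]                     # new frames
pred = gp_predict(X_train, y_train, X_test)                 # predicted edits
```

The same structure extends to the multi-output case by stacking one output column per degree of freedom; the GP's smoothness prior is what lets a few seconds of edited examples plausibly generalize to unedited frames nearby in pose space.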