Animating non-humanoid characters with human motion data

  • Authors:
  • Katsu Yamane (Disney Research, Pittsburgh and Carnegie Mellon University)
  • Yuka Ariki (Disney Research, Pittsburgh and Nara Institute of Science and Technology, Japan)
  • Jessica Hodgins (Carnegie Mellon University and Disney Research, Pittsburgh)

  • Venue:
  • Proceedings of the 2010 ACM SIGGRAPH/Eurographics Symposium on Computer Animation
  • Year:
  • 2010

Abstract

This paper presents a method for generating animations of non-humanoid characters from human motion capture data. Characters considered in this work have proportions and/or topology significantly different from those of humans, but are expected to convey expressions and emotions through body language that is understandable to human viewers. Keyframing is most commonly used to animate such characters. Our method provides an alternative for animating non-humanoid characters that leverages motion data from a human subject performing in the style of the target character. The method consists of a statistical mapping function learned from a small set of corresponding key poses, and a physics-based optimization process to improve physical realism. We demonstrate our approach on three characters and a variety of motions with emotional expressions.
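To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea of learning a pose mapping from a small set of corresponding key poses and applying it frame by frame to a captured human motion. It is a minimal stand-in, not the paper's statistical model: it uses a simple ridge-regularized linear regression, and the pose dimensions, key-pose data, and regularization weight are assumptions for illustration only. The subsequent physics-based optimization stage is not shown.

```python
# Minimal sketch (NOT the paper's actual statistical model): learn a linear
# map from human poses to character poses from a few corresponding key poses,
# then apply it to a captured human motion clip. All dimensions and data here
# are hypothetical placeholders.
import numpy as np


def fit_pose_mapping(human_keys, char_keys, ridge=1e-3):
    """Fit W, b so that char_pose ~= human_pose @ W + b.

    human_keys: (K, Dh) array of human key poses (e.g. joint angles).
    char_keys:  (K, Dc) array of corresponding character key poses.
    ridge:      regularization weight, needed because K is small.
    """
    K, Dh = human_keys.shape
    X = np.hstack([human_keys, np.ones((K, 1))])        # append bias column
    A = X.T @ X + ridge * np.eye(Dh + 1)                # regularized normal equations
    Wb = np.linalg.solve(A, X.T @ char_keys)            # shape (Dh + 1, Dc)
    return Wb[:-1], Wb[-1]


def map_motion(human_motion, W, b):
    """Apply the learned mapping to every frame of a human motion clip."""
    return human_motion @ W + b


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Dh, Dc, K = 60, 24, 10                              # assumed DOF counts, number of key poses
    human_keys = rng.standard_normal((K, Dh))           # placeholder key poses
    char_keys = rng.standard_normal((K, Dc))
    W, b = fit_pose_mapping(human_keys, char_keys)

    clip = rng.standard_normal((200, Dh))               # placeholder 200-frame human capture
    char_clip = map_motion(clip, W, b)
    print(char_clip.shape)                              # (200, 24)
```

In practice, a linear map of this kind would only capture coarse pose correspondence; the paper's approach additionally relies on a learned statistical mapping and a physics-based optimization step to produce plausible character motion.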