Interactive motion modeling and parameterization by direct demonstration

  • Authors:
  • Carlo Camporesi, Yazhou Huang, Marcelo Kallmann

  • Affiliation:
  • University of California, Merced

  • Venue:
  • IVA '10: Proceedings of the 10th International Conference on Intelligent Virtual Agents
  • Year:
  • 2010


Abstract

While interactive virtual humans are becoming widely used in education, training and therapeutic applications, building animations that are both realistic and parameterized with respect to a given scenario remains a complex and time-consuming task. To improve this situation, we propose a framework based on the direct demonstration and parameterization of motions. The presented approach addresses three important aspects of the problem in an integrated fashion: (1) our framework relies on an interactive real-time motion capture interface that empowers non-skilled animators to model realistic upper-body actions and gestures by direct demonstration; (2) our interface also accounts for the interactive definition of clustered example motions, in order to adequately represent the variations of interest for a given motion being modeled; and (3) we present an inverse blending optimization technique that solves the problem of precisely parameterizing a cluster of example motions with respect to arbitrary spatial constraints. The optimization is efficiently solved online, allowing autonomous virtual humans to precisely perform learned actions and gestures toward arbitrarily given targets. Our proposed framework has been implemented in an immersive multi-tile stereo visualization system, providing a powerful and intuitive interface for programming generic parameterized motions by demonstration.
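To illustrate the kind of inverse blending described in point (3), the following is a minimal sketch: it finds convex blend weights whose weighted combination of the example motions' end-effector positions best matches a desired spatial target. All function and variable names here are illustrative assumptions; the paper's actual method optimizes blends over full example motions, not just single positions.

```python
import numpy as np

def inverse_blend_weights(example_targets, goal, iters=200, lr=0.1):
    """Illustrative sketch: find convex blend weights w (w >= 0, sum(w) = 1)
    minimizing || sum_i w_i * p_i - goal ||, where p_i is the end-effector
    position reached by example motion i."""
    P = np.asarray(example_targets, dtype=float)   # (k, 3) example end-effector positions
    k = len(P)
    w = np.full(k, 1.0 / k)                        # start from a uniform blend
    for _ in range(iters):
        err = P.T @ w - goal                       # current blended-position error
        grad = P @ err                             # gradient of 0.5 * ||P^T w - goal||^2
        w = np.clip(w - lr * grad, 0.0, None)      # gradient step, then clamp to w >= 0
        s = w.sum()
        w = w / s if s > 0 else np.full(k, 1.0 / k)  # renormalize onto the simplex
    return w

# Three example motions reaching three distinct targets; blend toward a new goal.
examples = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
goal = np.array([0.4, 0.3, 0.0])
w = inverse_blend_weights(examples, goal)
blended = np.asarray(examples).T @ w               # blended position approximates the goal
```

Because the weights are kept convex, the blended result stays inside the space spanned by the demonstrated examples, which is one reason example clusters (point 2) need to cover the variations of interest.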