Example-based control of human motion

  • Authors:
  • Eugene Hsu, Sommer Gentry, Jovan Popović

  • Affiliation:
  • Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology

  • Venue:
  • SCA '04 Proceedings of the 2004 ACM SIGGRAPH/Eurographics symposium on Computer animation
  • Year:
  • 2004

Abstract

In human motion control applications, the mapping between a control specification and an appropriate target motion often defies an explicit encoding. We present a method that allows such a mapping to be defined by example, given that the control specification is recorded motion. Our method begins by building a database of semantically meaningful instances of the mapping, each of which is represented by synchronized segments of control and target motion. A dynamic programming algorithm can then be used to interpret an input control specification in terms of mapping instances. This interpretation induces a sequence of target segments from the database, which is concatenated to create the appropriate target motion. We evaluate our method on two examples of indirect control. In the first, we synthesize a walking human character that follows a sampled trajectory. In the second, we generate a synthetic partner for a dancer whose motion is acquired through motion capture.
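The dynamic programming step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes scalar per-frame features and a simple squared-error matching cost, whereas the actual method operates on synchronized segments of full-body motion with its own distance terms. The function name `interpret` and the database layout are hypothetical.

```python
def interpret(control, database):
    """Explain an input control sequence as a concatenation of database
    control segments, then return the corresponding target segments.

    control:  list of per-frame control features (scalars here).
    database: list of (ctrl_seg, target_seg) pairs, each a list of frames.
    """
    T = len(control)
    INF = float("inf")
    best = [INF] * (T + 1)      # best[t]: min cost to explain control[:t]
    best[0] = 0.0
    back = [None] * (T + 1)     # back-pointers: (segment start, db index)

    for t in range(1, T + 1):
        for i, (ctrl_seg, _) in enumerate(database):
            n = len(ctrl_seg)
            if n <= t and best[t - n] < INF:
                # squared-error cost of matching this example against
                # the input window ending at frame t (illustrative only)
                cost = sum((a - b) ** 2
                           for a, b in zip(control[t - n:t], ctrl_seg))
                if best[t - n] + cost < best[t]:
                    best[t] = best[t - n] + cost
                    back[t] = (t - n, i)

    # walk the back-pointers and concatenate the chosen target segments
    target, t = [], T
    while t > 0:
        start, i = back[t]
        target = database[i][1] + target
        t = start
    return target
```

For example, with a two-example database `[([0, 0], [10, 10]), ([1, 1, 1], [20, 20, 20])]`, the input `[0, 0, 1, 1, 1]` is interpreted as the first example followed by the second, yielding the concatenated target `[10, 10, 20, 20, 20]`.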