A motion capture-based control-space approach for walking mannequins

  • Authors:
  • Julien Pettré; Jean-Paul Laumond

  • Affiliations:
  • LAAS-CNRS, 7, avenue du Colonel Roche, 31077 Toulouse, France.

  • Venue:
  • Computer Animation and Virtual Worlds
  • Year:
  • 2006


Abstract

Virtual mannequins need to navigate in order to interact with their environment. Their autonomy in accomplishing navigation tasks is provided by locomotion controllers. Control inputs can be user-defined or computed automatically to achieve high-level operations (e.g. obstacle avoidance). This paper presents a locomotion controller based on a motion capture editing technique. The controller inputs are the instantaneous linear and angular velocities of the walk. Our solution works in real time and supports continuous changes of the inputs at any time. The controller combines three main components to synthesize locomotion animations in a four-stage process. First, the Motion Library stores motion capture samples. These samples are analysed to compute quantitative characteristics. Second, the characteristics are represented in a linear control space. This geometric representation is appropriate for selecting and weighting three motion samples with respect to the input state. Third, locomotion cycles are synthesized by blending the selected motion samples; the blending is done in the frequency domain. Lastly, successive postures are extracted from the synthesized cycles in order to complete the animation of the moving mannequin. The method is demonstrated in this paper in a locomotion-planning context. Copyright © 2006 John Wiley & Sons, Ltd.
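
To make the control-space stage concrete, below is a minimal sketch in Python (with assumed helper names; it is not the authors' code) of how three motion samples and their blend weights could be selected for an input state. It assumes each sample in the Motion Library has been reduced offline to a 2D point of mean linear and angular walking velocity, and it triangulates those points so that a query input falls inside a triangle whose barycentric coordinates serve as blending weights. The frequency-domain blend is likewise an assumed simplification: a weighted mix of Fourier coefficients of time-normalized cycles.

# Sketch of the control-space selection step (assumed names, not the
# authors' implementation). Each motion capture sample is reduced to a
# 2D point (linear velocity v, angular velocity w); a query input is
# mapped to three library samples and barycentric blend weights.

import numpy as np
from scipy.spatial import Delaunay

def build_control_space(sample_velocities):
    """Triangulate the (v, w) characteristics of the Motion Library."""
    return Delaunay(np.asarray(sample_velocities, dtype=float))

def select_and_weight(tri, v, w):
    """Return indices of three motion samples and their blend weights
    for the input state (v, w), or None if it is unreachable."""
    q = np.array([v, w])
    s = int(tri.find_simplex(q))
    if s == -1:  # input outside the velocities covered by the library
        return None
    # Barycentric coordinates of q within triangle s.
    b = tri.transform[s, :2] @ (q - tri.transform[s, 2])
    weights = np.append(b, 1.0 - b.sum())
    return tri.simplices[s], weights

def blend_cycles(cycles, weights):
    """Blend time-normalized joint-angle cycles (arrays of shape
    (frames, joints)) in the frequency domain by mixing their Fourier
    coefficients (an assumed simplification of the blending stage)."""
    spectra = [np.fft.rfft(c, axis=0) for c in cycles]
    mixed = sum(wt * sp for wt, sp in zip(weights, spectra))
    return np.fft.irfft(mixed, n=cycles[0].shape[0], axis=0)

# Toy library: five walk cycles characterized offline by (v, w).
library = [(0.5, 0.0), (1.5, 0.0), (1.0, 0.5), (1.0, -0.5), (2.0, 0.3)]
tri = build_control_space(library)
result = select_and_weight(tri, v=1.1, w=0.1)
if result is not None:
    indices, weights = result
    print("blend samples", indices, "with weights", weights)

In the paper's pipeline, the synthesized cycle would then be sampled posture by posture as the input velocities evolve; the sketch above covers only the selection, weighting, and a frequency-domain blend of pre-aligned cycles.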