On-line motion blending for real-time locomotion generation

  • Authors:
  • Sang Il Park; Hyun Joon Shin; Tae Hoon Kim; Sung Yong Shin

  • Venue:
  • Computer Animation and Virtual Worlds - Special Issue: The Very Best Papers from CASA 2004
  • Year:
  • 2004

Abstract

In this paper, we present an integrated framework of on-line motion blending for locomotion generation. We first provide a novel scheme for incremental timewarping that guarantees time always moves forward. Combining the idea of motion blending with that of posture rearrangement, we introduce a motion transition graph to address on-line motion blending and transition simultaneously. Guided by a stream of motion specifications, our motion synthesis scheme moves from node to node in an on-line manner, blending a motion at a node and generating a transition motion at an edge. For smooth on-line motion transition, we also attach a set of example transition motions to each edge. To represent similar postures consistently, we exploit the inter-frame coherence embedded in the input motion specification. Finally, we provide a comprehensive solution to on-line motion retargeting by integrating existing techniques. Copyright © 2004 John Wiley & Sons, Ltd.
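
Two ideas in the abstract lend themselves to a brief illustration: an incremental timewarp whose step is clamped to stay positive, so warped time can only move forward, and a motion transition graph whose nodes hold example clips to blend and whose edges hold example transition motions. The sketch below is not from the paper; all class and function names are hypothetical, and the blend is a deliberately simplified placeholder.

```python
# Hypothetical sketch of a motion transition graph and an incremental timewarp.
# None of these names or structures come from the paper itself.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class MotionClip:
    poses: List[list]      # per-frame joint parameters
    duration: float        # clip length in seconds


@dataclass
class Node:
    name: str                                         # locomotion type, e.g. "walk"
    examples: List[MotionClip] = field(default_factory=list)


@dataclass
class Edge:
    examples: List[MotionClip] = field(default_factory=list)  # example transitions


class MotionTransitionGraph:
    def __init__(self) -> None:
        self.nodes: Dict[str, Node] = {}
        self.edges: Dict[Tuple[str, str], Edge] = {}

    def add_node(self, node: Node) -> None:
        self.nodes[node.name] = node

    def connect(self, src: str, dst: str, edge: Edge) -> None:
        self.edges[(src, dst)] = edge


def incremental_timewarp(t_prev: float, dt: float, rate: float,
                         eps: float = 1e-4) -> float:
    """Advance warped time; clamping the step to a small positive value
    keeps time from stalling or running backwards."""
    return t_prev + max(rate * dt, eps)


def blend(examples: List[MotionClip], weights: List[float], t: float) -> list:
    """Toy blend: weighted sum of per-joint parameters at time t (seconds).
    A real locomotion blender would align phases and blend rotations properly."""
    sampled = []
    for clip, w in zip(examples, weights):
        idx = min(int(t / clip.duration * (len(clip.poses) - 1)),
                  len(clip.poses) - 1)
        sampled.append((clip.poses[idx], w))
    n_joints = len(sampled[0][0])
    return [sum(pose[j] * w for pose, w in sampled) for j in range(n_joints)]
```

Under these assumptions, an on-line controller would call incremental_timewarp each frame, blend the examples of the current node while the requested locomotion type is unchanged, and switch to the examples stored on the outgoing edge when the specification stream asks for a new type.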