Learning the stylistic similarity between human motions

  • Authors:
  • Yu-Ren Chien; Jing-Sin Liu

  • Affiliations:
  • Institute of Information Science, Academia Sinica, Taiwan (both authors)

  • Venue:
  • ISVC'06 Proceedings of the Second International Conference on Advances in Visual Computing - Volume Part I
  • Year:
  • 2006

Abstract

This paper presents a computational model of stylistic similarity between human motions, derived statistically from a comprehensive collection of captured, stylistically similar motion pairs. In this model, a set of hypersurfaces learned by a single-class SVM and kernel PCA characterizes the region occupied by stylistically similar motion pairs in the space of all possible pairs. The model is then applied in a system that adapts an existing clip of human motion to a new environment, where stylistic distortion is avoided by enforcing stylistic similarity between the synthesized motion and the existing motion. The effectiveness of the system has been verified on 18 distinct adaptations, which produced walking, jumping, and running motions exhibiting both the intended styles and the intended contact configurations.
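The core idea of the abstract, learning a boundary around the region of "stylistically similar" motion pairs so that new pairs can be tested for membership, can be illustrated with a one-class SVM. The sketch below is an assumption-laden toy, not the authors' pipeline: the motion features are random placeholder vectors, and a "similar" pair is simulated as a small perturbation of one motion.

```python
# Toy sketch of the paper's idea: fit a hypersurface enclosing pairs of
# stylistically similar motions, using a one-class SVM with an RBF kernel.
# Feature vectors here are random placeholders, NOT real motion features.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

def make_similar_pair(dim=8, noise=0.05):
    """Concatenate a motion feature vector with a slightly perturbed
    variant, standing in for a stylistically similar motion pair."""
    a = rng.normal(size=dim)
    b = a + noise * rng.normal(size=dim)
    return np.concatenate([a, b])

# Training set: 200 similar pairs, each a 16-dimensional concatenation.
X_similar = np.stack([make_similar_pair() for _ in range(200)])

# nu bounds the fraction of training pairs allowed outside the boundary.
model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X_similar)

# Membership test: +1 = inside the learned similar-pair region, -1 = outside.
fresh_similar = model.predict(make_similar_pair().reshape(1, -1))[0]
unrelated = np.concatenate([rng.normal(size=8), rng.normal(size=8)])
fresh_unrelated = model.predict(unrelated.reshape(1, -1))[0]
```

In the paper's application, such a membership test would act as a constraint during synthesis: a candidate adapted motion is acceptable only if its pairing with the original motion falls inside the learned region.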