On-Line Motion Style Transfer

  • Authors:
  • Xiaomao Wu, Lizhuang Ma, Can Zheng, Yanyun Chen, Ke-Sen Huang

  • Affiliations:
  • Xiaomao Wu, Lizhuang Ma, Can Zheng: Department of Computer Science & Engineering, Shanghai Jiao Tong University, Shanghai, P. R. China
  • Yanyun Chen: Microsoft Research Asia, Beijing, P. R. China
  • Ke-Sen Huang: Department of Computer Science, National Tsing Hua University, Hsinchu, Taiwan, R.O.C.

  • Venue:
  • ICEC'06 Proceedings of the 5th international conference on Entertainment Computing
  • Year:
  • 2006


Abstract

Motion capture techniques play an important role in computer animation. Because motion capture data is relatively expensive to acquire and virtual environments change frequently in practical applications, researchers in this area have focused on algorithms for editing captured motion data and synthesizing new motions from existing motion databases. Although abundant work has been done on motion editing and synthesis, little of it explicitly takes motion style into consideration. Meanwhile, existing style editing algorithms either require an explicit definition of "style" or a time-consuming training process. In this paper, we propose a fast and convenient algorithm for human-motion style editing. We define the style of a motion as the statistical properties of the mean and standard deviation of its joint quaternions on the 4D unit sphere. The proposed algorithm transfers the style of one motion to another by transferring these properties. Experimental results demonstrate that our approach offers fast execution, low memory usage, and easy implementation. It can be widely applied in real-time entertainment-computing applications such as gaming and digital movie production.
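The core idea of the abstract, transferring per-joint mean and standard deviation from a style motion onto a content motion, can be sketched as follows. This is a simplified, hypothetical illustration, not the paper's implementation: joint rotations are represented here as 3D rotation vectors (log-quaternion coordinates) so that the statistics and the affine transfer are plain Euclidean operations, whereas the paper computes them directly on the unit quaternion sphere.

```python
import numpy as np

def transfer_style(content, style, eps=1e-8):
    """Re-style a motion by matching per-joint first- and second-order
    statistics, a simplified sketch of statistic-based style transfer.

    content, style: arrays of shape (frames, joints, 3) holding joint
    rotations as rotation-vector (log-quaternion) coordinates -- an
    illustrative Euclidean stand-in for the paper's quaternion space.

    Returns a motion that follows the content trajectory but carries
    the style motion's per-joint mean and standard deviation.
    """
    c_mean = content.mean(axis=0)        # per-joint mean, shape (joints, 3)
    c_std = content.std(axis=0) + eps    # eps guards against rigid joints
    s_mean = style.mean(axis=0)
    s_std = style.std(axis=0)
    # Whiten the content motion, then re-scale and re-center it with
    # the style motion's statistics.
    return (content - c_mean) / c_std * s_std + s_mean
```

Because the transfer is a closed-form per-frame affine map, it runs in a single pass over the data with no training phase, which is consistent with the fast, low-memory, on-line behavior the abstract claims.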