Creating a virtual character that exhibits realistic physical behaviors requires a rich set of animations. To mimic the variety and subtlety of human behavior, we may need to animate not only a wide range of behaviors but also variations of the same behavior influenced by the environment and by the character's state, including its emotional and physiological state. A common approach is to gather a set of animations produced by artists or captured from live performers, but this can be extremely costly in time and effort. In this work, we propose a model that learns to generate styled motion and an algorithm that produces new motion styles via style interpolation. The model takes a set of styled motions as training samples and creates new motions that generalize across the given styles. Our style interpolation algorithm blends motions with distinct styles and improves on the performance of previous work. We evaluate our algorithm on walking motions of different styles, and the experimental results show that our method significantly outperforms previous work.
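To illustrate the kind of blending that style interpolation performs, the sketch below interpolates two time-aligned styled motions joint by joint using quaternion slerp. This is a minimal illustration, not the paper's method: the motion representation (arrays of per-joint unit quaternions), the helper names `slerp` and `blend_styles`, and the assumption that the two motions are already time-aligned are all ours.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:      # flip to take the shorter arc on the 4D sphere
        q1, dot = -q1, -dot
    if dot > 0.9995:   # nearly parallel: linear interpolation is stable
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def blend_styles(motion_a, motion_b, weight):
    """Blend two time-aligned styled motions frame by frame.

    motion_a, motion_b: arrays of shape (frames, joints, 4) holding one unit
    quaternion per joint per frame; weight in [0, 1] mixes style B into style A.
    """
    frames, joints, _ = motion_a.shape
    out = np.empty_like(motion_a)
    for f in range(frames):
        for j in range(joints):
            out[f, j] = slerp(motion_a[f, j], motion_b[f, j], weight)
    return out
```

Blending raw poses like this assumes the motions are already in temporal correspondence (e.g., after time warping); interpolating in a learned model's parameter space, as the paper proposes, avoids artifacts that naive frame-wise blending produces when the styles differ in timing.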