The online synthesis of stylized interactive movements with high levels of realism is a difficult problem in computer graphics. We present a new approach for learning structured dynamical models for the synthesis of interactive body movements, based on hierarchical Gaussian process latent variable models. The latent spaces of this model encode postural manifolds and the dependency between the postures of the interacting characters. In addition, our model includes dimensions representing emotional style variations (neutral, happy, angry, sad) and individual-specific motion style. The dynamics of the state in the latent space are modeled by a Gaussian Process Dynamical Model, a probabilistic dynamical model that can learn to generate arbitrary smooth trajectories in real time. The proposed framework offers a high degree of flexibility, both in the definition of the model structure and in the complexity of the learned motion trajectories. To assess the suitability of the proposed framework for the generation of highly realistic motion, we performed a 'Turing test': a psychophysical study in which human observers classified the emotions and rated the naturalness of generated and natural emotional handshakes. Classification results for the two stimulus groups were not significantly different, and for all emotional styles except neutral, participants rated the synthesized handshakes as natural as animations driven by the original trajectories. This shows that the proposed method generates highly realistic interactive movements that are almost indistinguishable from natural ones. As a further extension, we demonstrate the capability of the method to interpolate between different emotional styles.
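To illustrate the dynamical component described above, here is a minimal sketch of a first-order Gaussian-process dynamics model of the form x_{t+1} = f(x_t), using only the kernel-regression mean prediction. It is not the paper's implementation: a full GPDM also learns the latent coordinates jointly and propagates predictive uncertainty, both of which are omitted here, and all names and parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

class GPDynamics:
    """Mean-prediction sketch of first-order GP dynamics x_{t+1} = f(x_t)."""

    def __init__(self, X, lengthscale=0.5, noise=1e-4):
        # Training pairs: each latent state maps to its successor.
        self.Xin, self.Xout = X[:-1], X[1:]
        self.lengthscale = lengthscale
        K = rbf_kernel(self.Xin, self.Xin, lengthscale)
        K += noise * np.eye(len(self.Xin))
        # Precompute K^{-1} * Xout for the GP mean predictor.
        self.alpha = np.linalg.solve(K, self.Xout)

    def step(self, x):
        """Predict the next latent state from the current one."""
        k = rbf_kernel(x[None, :], self.Xin, self.lengthscale)
        return (k @ self.alpha)[0]

    def rollout(self, x0, steps):
        """Autoregressively generate a smooth latent trajectory."""
        traj = [x0]
        for _ in range(steps):
            traj.append(self.step(traj[-1]))
        return np.array(traj)
```

In the paper's setting, the rolled-out latent trajectory would then be decoded back to full-body poses through the hierarchical GP-LVM mappings; this sketch only covers the trajectory-generation step.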