Online simulation of emotional interactive behaviors with hierarchical Gaussian process dynamical models

  • Authors:
  • Nick Taubert, Andrea Christensen, Dominik Endres, Martin A. Giese

  • Affiliations:
  • University Clinic Tübingen, Tübingen, Germany (all authors)

  • Venue:
  • Proceedings of the ACM Symposium on Applied Perception
  • Year:
  • 2012

Abstract

The online synthesis of stylized interactive movements with high levels of realism is a difficult problem in computer graphics. We present a new approach for learning structured dynamical models for the synthesis of interactive body movements, based on hierarchical Gaussian process latent variable models. The latent spaces of this model encode postural manifolds and the dependencies between the postures of the interacting characters. In addition, our model includes dimensions representing emotional style variations (neutral, happy, angry, and sad) and individual-specific motion style. The dynamics of the latent state are modeled by a Gaussian Process Dynamical Model, a probabilistic dynamical model that can learn to generate arbitrary smooth trajectories in real time. The proposed framework offers a high degree of flexibility, both in the definition of the model structure and in the complexity of the learned motion trajectories. To assess the suitability of the proposed framework for generating highly realistic motion, we performed a 'Turing test': a psychophysical study in which human observers classified the emotions and rated the naturalness of generated and natural emotional handshakes. Classification results did not differ significantly between the two stimulus groups, and for all emotional styles except neutral, participants rated the synthesized handshakes as natural as animations driven by the original trajectories. This shows that the proposed method generates highly realistic interactive movements that are almost indistinguishable from natural ones. As a further extension, we demonstrate the capability of the method to interpolate between different emotional styles.
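To make the core mechanism concrete, the sketch below illustrates the two GP components the abstract names: a Gaussian Process Dynamical Model that rolls a latent state forward in time, and a GP-LVM-style mapping that decodes each latent state into a body posture. This is a minimal illustration, not the authors' implementation: it assumes fixed kernel hyperparameters, a first-order autoregressive latent transition, and mean-only prediction, and it omits the hierarchy, the interaction coupling between characters, and the emotion/style dimensions. All names (`rbf_kernel`, `GPRegressor`, `dynamics`, `pose_map`) and the toy data are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-wise point sets A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

class GPRegressor:
    """Minimal GP posterior-mean predictor with fixed hyperparameters."""
    def __init__(self, X, Y, noise=1e-4, **kern):
        self.X, self.kern = X, kern
        K = rbf_kernel(X, X, **kern) + noise * np.eye(len(X))
        self.alpha = np.linalg.solve(K, Y)  # cache K^{-1} Y

    def predict(self, Xstar):
        # Posterior mean: k(x*, X) K^{-1} Y
        return rbf_kernel(Xstar, self.X, **self.kern) @ self.alpha

# Toy stand-ins for a learned latent trajectory X (T x q) and the
# corresponding poses Y (T x d); a real model would obtain X by
# GP-LVM training on motion capture data.
rng = np.random.default_rng(0)
T, q, d = 200, 3, 60
t = np.linspace(0, 4 * np.pi, T)
X = np.stack([np.sin(t), np.cos(t), 0.1 * t], axis=1)
Y = X @ rng.normal(size=(q, d))

# GPDM dynamics: x_t = g(x_{t-1}); GP-LVM mapping: y_t = h(x_t).
dynamics = GPRegressor(X[:-1], X[1:], lengthscale=1.0)
pose_map = GPRegressor(X, Y, lengthscale=1.0)

# Online synthesis: roll the dynamics mean forward frame by frame and
# decode a posture at each step; each step is a single kernel product,
# which is what makes real-time generation feasible.
x = X[-1:]
for frame in range(100):
    x = dynamics.predict(x)  # next latent state (posterior mean)
    y = pose_map.predict(x)  # decoded body posture for this frame
```

In the full model described above, additional latent dimensions for emotional and individual style would condition these mappings, so that moving along (or interpolating between) style coordinates morphs the synthesized handshake between emotional styles.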