Classification and translation of style and affect in human motion using RBF neural networks

  • Authors:
  • S. Ali Etemad; Ali Arya

  • Venue:
  • Neurocomputing
  • Year:
  • 2014

Quantified Score

Hi-index 0.01

Abstract

Human motion can be performed with a variety of affects or styles, such as happy, sad, energetic, and tired, among many others. Modeling and classifying these styles, and more importantly, translating them from one sequence onto another, has become a popular problem in the fields of graphics, multimedia, and human-computer interaction. In this paper, radial basis functions (RBFs) are used to model and extract stylistic and affective features from motion data. We demonstrate that, using only a few basis functions per degree of freedom, styles in cycles of human walking can be modeled successfully. Furthermore, we employ an ensemble of RBF neural networks to learn the affective/stylistic features following time warping and principal component analysis. The system learns these components and classifies stylistic motion sequences into distinct affective and stylistic classes. It also uses the ensemble of neural networks to learn motion affects and styles so that it can translate them onto neutral input sequences. Experimental results, along with both numerical and perceptual validations, confirm the highly accurate and effective performance of the system.
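
The pipeline outlined in the abstract (RBF modeling of each degree of freedom, dimensionality reduction, and learning of style classes) can be illustrated with a minimal sketch. The code below is a toy under stated assumptions, not the paper's implementation: it uses synthetic single-DOF walk-cycle trajectories, a handful of Gaussian basis functions fit by least squares, SVD-based PCA on the resulting RBF weights, and a simple nearest-centroid classifier as a stand-in for the ensemble of RBF neural networks; the time-warping step and the style-translation step are omitted.

```python
import numpy as np

# --- Hypothetical setup (not from the paper): synthetic joint-angle cycles ---
# Each "motion" is one walk cycle for a single degree of freedom (DOF),
# sampled at T frames and perturbed to mimic a stylistic variation.
rng = np.random.default_rng(0)
T = 100
t = np.linspace(0.0, 1.0, T)

def synthetic_cycle(style_amp, style_phase):
    """Toy stand-in for a captured joint-angle trajectory over one gait cycle."""
    return np.sin(2 * np.pi * t) + style_amp * np.sin(4 * np.pi * t + style_phase)

# --- RBF modeling: a few Gaussian basis functions per DOF ---
def rbf_design_matrix(t, n_centers=5, width=0.15):
    centers = np.linspace(0.0, 1.0, n_centers)
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

Phi = rbf_design_matrix(t)

def rbf_weights(trajectory):
    """Least-squares fit of RBF weights; the weights act as a compact descriptor."""
    w, *_ = np.linalg.lstsq(Phi, trajectory, rcond=None)
    return w

# Build a small labeled set of two hypothetical styles ("neutral", "energetic").
X, y = [], []
for label, (amp, phase) in enumerate([(0.05, 0.0), (0.6, 0.3)]):
    for _ in range(20):
        traj = synthetic_cycle(amp + 0.05 * rng.standard_normal(), phase)
        X.append(rbf_weights(traj))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

# --- PCA on the RBF-weight features (stand-in for the paper's PCA step) ---
X_centered = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
Z = X_centered @ Vt[:2].T  # keep 2 principal components

# --- Nearest-centroid classifier as a placeholder for the RBF-network ensemble ---
centroids = np.array([Z[y == c].mean(axis=0) for c in np.unique(y)])
pred = np.argmin(np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2), axis=1)
print("training accuracy of the toy pipeline:", (pred == y).mean())
```

In this sketch the fitted RBF weights play the role of the per-DOF style features; in the paper these features feed an ensemble of RBF neural networks that both classifies styles and translates them onto neutral sequences, neither of which the placeholder classifier above attempts.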