Acquiring and validating motion qualities from live limb gestures

  • Authors:
  • Liwei Zhao; Norman I. Badler

  • Affiliations:
  • Center for Human Modeling and Simulation, Department of Computer and Information Science, University of Pennsylvania, Philadelphia, PA (both authors)

  • Venue:
  • Graphical Models
  • Year:
  • 2005

Abstract

This paper presents a neural computing model that automatically extracts motion qualities, expressed as Laban Movement Analysis (LMA) Effort factors, from live performance. The model accepts as input both 3D motion-capture data and 2D video projections; its output is a classification of the motion qualities detected in the input. The neural networks are trained with professional LMA notators to ensure valid analysis and achieve about 90% accuracy in motion-quality recognition. Combining this system with the EMOTE motion synthesis system automates both the observation and analysis processes, producing natural gestures for embodied communicative agents.
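To illustrate the kind of mapping the abstract describes, the sketch below shows a minimal feed-forward classifier that takes a vector of motion features (e.g., limb velocities and accelerations derived from capture data) and outputs a score per LMA Effort factor. The architecture, feature dimensionality, and all names here are assumptions for illustration only, not the paper's actual network.

```python
import numpy as np

# The four LMA Effort factors; each score is interpreted as the
# probability that the corresponding quality is present in the gesture.
EFFORT_FACTORS = ["Space", "Weight", "Time", "Flow"]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class EffortClassifier:
    """Hypothetical one-hidden-layer network mapping motion features
    to per-factor Effort scores (an illustrative stand-in, not the
    trained model from the paper)."""

    def __init__(self, n_features, n_hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        # Small random weights; a real system would train these
        # against annotations supplied by professional LMA notators.
        self.W1 = rng.normal(0.0, 0.1, (n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, len(EFFORT_FACTORS)))
        self.b2 = np.zeros(len(EFFORT_FACTORS))

    def predict(self, features):
        """Forward pass: features -> per-factor scores in (0, 1)."""
        h = np.tanh(features @ self.W1 + self.b1)
        return sigmoid(h @ self.W2 + self.b2)

# Usage: classify a (hypothetical) 12-dimensional feature vector.
clf = EffortClassifier(n_features=12)
scores = clf.predict(np.zeros(12))  # one score per Effort factor
```

Each output dimension corresponds to one Effort factor, so a detected motion quality is simply a factor whose score exceeds a chosen threshold.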