MKM: A global framework for animating humans in virtual reality applications

  • Authors:
  • Franck Multon; Richard Kulpa; Benoit Bideau

  • Affiliations:
  • M2S, University Rennes, 2 Avenue Charles Tillon, 35044 Rennes, France and Bunraku Project, IRISA, Campus de Beaulieu, 35042 Rennes, France; -; M2S, University Rennes, 2 Avenue Charles Tillon, 35044 Rennes, France

  • Venue:
  • Presence: Teleoperators and Virtual Environments
  • Year:
  • 2008

Abstract

Virtual humans are increasingly used in VR applications, but their animation remains a challenge, especially when complex tasks must be carried out in interaction with the user. In many such applications, credible virtual characters play a major role in the sense of presence. Motion editing techniques assume that natural laws are intrinsically encoded in prerecorded trajectories and that modifications can preserve these laws, leading to credible autonomous actors. However, complete knowledge of all the constraints is required to ensure continuity or to synchronize and blend the several actions needed to achieve a given task. We propose a framework capable of performing these tasks in an interactive environment that can change at each frame, depending on the user's commands. This framework enables VR applications to animate, in real time, dozens of characters under complex constraints, and up to hundreds of characters if only ground adaptation is performed. It offers motion synchronization, blending, retargeting, and adaptation thanks to an enhanced inverse kinematics and kinetics solver. To evaluate this framework, we compared the motor behavior of subjects in real and virtual environments.
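
The abstract refers to per-frame motion blending among the framework's capabilities. As a rough, hypothetical illustration only (it is not the paper's own representation or solver, and all names below are invented for the example), the following Python sketch blends several prerecorded poses, stored as per-joint unit quaternions, with normalized weights:

```python
import numpy as np

def blend_poses(poses, weights):
    """Blend poses given as (num_joints, 4) arrays of unit quaternions.

    Illustrative sketch: a normalized linear blend (nlerp) of per-joint
    quaternions, which approximates spherical interpolation for nearby
    rotations. The actual MKM framework uses its own representation.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # make blend weights sum to 1
    ref = poses[0]
    acc = np.zeros_like(ref)
    for wi, pose in zip(w, poses):
        # Flip each quaternion into the same hemisphere as the reference
        # pose so that q and -q (the same rotation) do not cancel out.
        sign = np.where((pose * ref).sum(axis=1, keepdims=True) < 0.0, -1.0, 1.0)
        acc += wi * sign * pose
    # Renormalize each joint's quaternion to unit length.
    return acc / np.linalg.norm(acc, axis=1, keepdims=True)

# Usage: equal-weight blend of two single-joint poses
# (identity vs. a 90-degree rotation about the Z axis).
idle = np.array([[1.0, 0.0, 0.0, 0.0]])
wave = np.array([[np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)]])
print(blend_poses([idle, wave], [0.5, 0.5]))
```

In a real-time loop, such a blend would be recomputed every frame with weights driven by the synchronization logic, before constraint adaptation (e.g., ground contact) is applied by the solver.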