Real time whole body motion mapping for avatars and robots

  • Authors:
  • Bernhard Spanlang, Xavi Navarro, Jean-Marie Normand, Sameer Kishore, Rodrigo Pizarro, Mel Slater

  • Affiliations:
  • Universitat de Barcelona, Spain (all authors); Jean-Marie Normand also École Centrale de Nantes, France; Mel Slater also Institució Catalana de Recerca i Estudis Avançats, Spain

  • Venue:
  • Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology
  • Year:
  • 2013


Abstract

We describe a system that controls different robots and avatars from a real-time motion stream. The underlying problem is that motion data from tracking systems is usually represented differently from the motion data required to drive an avatar or a robot: the two may use different joint sets, and motion may be represented either by absolute joint positions and rotations or by a root position, bone lengths, and relative rotations in a skeletal hierarchy. Our system resolves these issues by remapping the tracked motion in real time so that the avatar or robot performs motions that are visually close to those of the tracked person. The mapping can also be reconfigured interactively at run-time. We demonstrate the effectiveness of our system through case studies in which a tracked person is embodied as an avatar in immersive virtual reality or as a robot in a remote location. We show this with a variety of tracking systems, humanoid avatars, and robots.
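One representation gap the abstract mentions is between absolute (world-space) joint rotations, as delivered by many trackers, and rotations relative to a parent joint in a skeletal hierarchy, as required by many avatar and robot rigs. A minimal sketch of that conversion is below; the joint names, the `HIERARCHY` table, and the function name are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical miniature skeleton: joint -> parent (None marks the root).
HIERARCHY = {"root": None, "spine": "root", "head": "spine"}

def world_to_local(world_rots):
    """Convert absolute (world) joint rotation matrices into rotations
    relative to each joint's parent, as a hierarchical rig expects.

    For each non-root joint: local = parent_world^T @ world
    (the transpose inverts an orthonormal rotation matrix).
    The root keeps its world rotation.
    """
    local = {}
    for joint, parent in HIERARCHY.items():
        if parent is None:
            local[joint] = world_rots[joint]
        else:
            local[joint] = world_rots[parent].T @ world_rots[joint]
    return local
```

A full retargeting pipeline would also rescale root translation by bone-length ratios and handle mismatched joint sets, but the relative-rotation step above is the core of mapping between the two representations.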