We describe a system for controlling different robots and avatars from a real-time motion stream. The underlying problem is that motion data from tracking systems is usually represented differently from the motion data required to drive an avatar or a robot: the joint sets may differ, and motion may be represented either by absolute joint positions and rotations or by a root position, bone lengths, and relative rotations in a skeletal hierarchy. Our system resolves these mismatches by remapping the tracked motion in real time so that the avatar or robot performs motions that are visually close to those of the tracked person. The mapping can also be reconfigured interactively at run time. We demonstrate the effectiveness of our system through case studies in which a tracked person is embodied as an avatar in immersive virtual reality or as a robot in a remote location, using a variety of tracking systems, humanoid avatars, and robots.