Visual motion capturing for kinematic model estimation of a humanoid robot

  • Authors:
  • Andre Gaschler

  • Affiliations:
  • Technische Universität München, Germany

  • Venue:
  • DAGM'11: Proceedings of the 33rd International Conference on Pattern Recognition
  • Year:
  • 2011

Abstract

Controlling a tendon-driven robot like the humanoid Ecce is a difficult task, even more so when its kinematics and its pose are not known precisely. In this paper, we present a visual motion capture system that allows both real-time measurement of robot joint angles and estimation of its kinematic model. Unlike other humanoid robots, Ecce (see Fig. 1A) is completely molded by hand and its joints are not equipped with angle sensors. This anthropomimetic robot design [5] therefore demands both (i) real-time measurement of joint angles and (ii) estimation of its kinematic model. The underlying principle of this work is that all kinematic model parameters can be derived from visual motion data. The joint angle data ultimately lay the foundation for physics-based simulation and control of this novel musculoskeletal robot.
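
As a rough illustration of how joint angles can be read off from motion-capture data (this is a minimal sketch, not the paper's actual method, marker layout, or calibration pipeline), the following Python snippet estimates the rotation angle of one rigid marker set relative to a reference pose using a least-squares (Kabsch) alignment. The function names and marker coordinates are hypothetical.

```python
import numpy as np

def kabsch_rotation(P, Q):
    """Least-squares rotation R with R @ p_i ~= q_i for centered Nx3 point sets P, Q."""
    H = P.T @ Q                      # 3x3 cross-covariance of the two marker sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])       # guard against reflections
    return Vt.T @ D @ U.T

def joint_angle(markers_ref, markers_cur):
    """Rotation angle (rad) of a rigid marker set between a reference pose and the current frame."""
    P = markers_ref - markers_ref.mean(axis=0)   # center both marker clouds
    Q = markers_cur - markers_cur.mean(axis=0)
    R = kabsch_rotation(P, Q)
    # trace(R) = 1 + 2*cos(theta) for a 3D rotation
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

# Example: four markers on a limb segment, rotated 30 degrees about the z-axis
theta = np.deg2rad(30.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
ref = np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.1], [0.1, 0.1, 0.0]])
cur = ref @ Rz.T
print(np.rad2deg(joint_angle(ref, cur)))  # ~30.0
```

In the same spirit, joint axes and link lengths of a kinematic model could be fitted to such relative transforms collected over many poses; the paper's actual estimation procedure should be consulted for the concrete formulation.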