Autonomous robot calibration using vision technology

  • Authors:
  • Yan Meng; Hanqi Zhuang

  • Affiliations:
  • Department of Electrical and Computer Engineering, Stevens Institute of Technology, Hoboken, NJ, USA; Department of Electrical Engineering, Florida Atlantic University, Boca Raton, FL, USA

  • Venue:
  • Robotics and Computer-Integrated Manufacturing
  • Year:
  • 2007


Abstract

Unlike traditional robot calibration methods, which require expensive external calibration apparatus and elaborate setups to measure 3D feature points in the reference frame, this paper proposes a vision-based self-calibration method for a serial robot manipulator that requires only a ground-truth scale in the reference frame. The proposed algorithm assumes that the camera is rigidly attached to the robot end-effector, so that the pose of the manipulator can be obtained from the pose of the camera. By designing a manipulator movement trajectory, the camera pose at each configuration can be estimated up to a scale factor using the factorization method, and a nonlinear least-squares algorithm is applied to improve robustness. An efficient approach to estimating this scale factor is also proposed. The main advantage of this self-calibration method is that only image sequences of a calibration object and one ground-truth length are needed, which makes the robot calibration procedure more autonomous in a dynamic manufacturing environment. Simulation and experimental studies on a PUMA 560 robot demonstrate the convenience and effectiveness of the proposed self-calibration approach.
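The scale-recovery step described in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical Python illustration, not the authors' code: it assumes a factorization step has already produced camera translations and 3D feature points up to an unknown common scale, and shows how a single known ground-truth length between two reference points fixes that scale. The function names and synthetic data are assumptions made purely for illustration.

```python
# Hypothetical sketch: recover the metric scale of an up-to-scale reconstruction
# from one known ground-truth length, as outlined in the abstract.
import numpy as np

def resolve_scale(points_up_to_scale, idx_a, idx_b, true_length):
    """Return the scale factor mapping the up-to-scale reconstruction to metric
    units, using the known distance between two reference feature points."""
    recon_length = np.linalg.norm(points_up_to_scale[idx_a] - points_up_to_scale[idx_b])
    return true_length / recon_length

def apply_scale(camera_translations, points_up_to_scale, scale):
    """Scale camera translations and 3D points into metric units."""
    return scale * camera_translations, scale * points_up_to_scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    metric_points = rng.uniform(-0.5, 0.5, size=(20, 3))        # true 3D feature points (metres)
    unknown_scale = 2.37                                         # scale lost by the factorization
    recon_points = metric_points / unknown_scale                 # up-to-scale reconstruction
    cam_t = rng.uniform(-1.0, 1.0, size=(5, 3)) / unknown_scale  # up-to-scale camera translations

    # Ground-truth length between feature points 0 and 1, measured once in the reference frame.
    true_len = np.linalg.norm(metric_points[0] - metric_points[1])

    s = resolve_scale(recon_points, 0, 1, true_len)
    cam_t_metric, points_metric = apply_scale(cam_t, recon_points, s)
    print("estimated scale:", s, "(true:", unknown_scale, ")")
```

In this toy setting the estimated scale matches the simulated one exactly; with real image measurements the up-to-scale poses would first be refined (e.g. by nonlinear least squares, as the abstract notes) before the ground-truth length is applied.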