On-Line Robotic Interception Planning Using a Rendezvous-Guidance Technique

  • Authors:
  • Farhad Agah; Mehran Mehrandezh; Robert G. Fenton; Beno Benhabib

  • Affiliations:
  • Computer Integrated Manufacturing Laboratory, Department of Mechanical and Industrial Engineering, University of Toronto, 5 King's College Road, Toronto, Ontario, Canada, M5S 3G8; Faculty of Engineering, University of Regina, Regina, Saskatchewan, Canada, S4S 0A2; e-mail: ...

  • Venue:
  • Journal of Intelligent and Robotic Systems
  • Year:
  • 2004

Abstract

A novel method for online robotic interception of moving objects using visual feedback is proposed in this paper. No prior knowledge of the object's motion is assumed. Since such objects may depart quickly from the robot's workspace, fast interception is a critical issue. Thus, a time-optimal rendezvous-guidance technique that takes the robot's dynamic limitations into account has been developed. In the proposed methodology, a parallel-navigation rule, originally introduced in the missile-guidance literature, is first applied to generate a set of instantaneous task-space velocity commands which, if executed, would keep the end-effector on a collision course with the object. Subsequently, a rendezvous-guidance method is used to reduce the original command set to one with velocity-matching capability. Finally, the fastest velocity command in the reduced set is chosen such that the dynamic limitations of the robot's actuators are not violated. The proposed algorithm yields fast and robust interception, as demonstrated by several simulation examples in 2D and 3D workspaces.
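
The three-step structure of the guidance law lends itself to a compact sketch. The Python fragment below is a minimal illustration under simplifying assumptions, not the authors' implementation: the names (interception_velocity, v_max, a_max) are hypothetical, a single Cartesian speed and acceleration bound stands in for the joint-level actuator limits, and a plain back-off search replaces the paper's time-optimal command selection.

```python
import numpy as np

def interception_velocity(p_e, v_e, p_o, v_o,
                          v_max=1.0, a_max=2.0, dt=0.01):
    """Return a task-space velocity command for the end-effector.

    p_e, v_e : current end-effector position / velocity (2D or 3D arrays)
    p_o, v_o : current object position / velocity (from visual feedback)
    v_max, a_max : assumed Cartesian speed / acceleration bounds standing in
                   for the robot's actuator limits
    dt : control period
    """
    r = p_o - p_e                      # line-of-sight (LOS) vector
    dist = np.linalg.norm(r)
    if dist < 1e-9:                    # already at the object
        return v_o.copy()
    r_hat = r / dist

    # 1) Parallel navigation: any command of the form v_o + c * r_hat (c > 0)
    #    nulls the LOS rotation and keeps the end-effector on a collision
    #    course; the free parameter c is the closing speed.

    # 2) Rendezvous guidance: keep only closing speeds that can still be
    #    braked to zero over the remaining range, so velocities match at
    #    interception:  c^2 / (2 * a_max) <= dist.
    c = np.sqrt(2.0 * a_max * dist)

    # 3) Actuator limits: the command must respect the speed bound, and the
    #    change from the current velocity must respect the acceleration bound.
    #    Take the largest feasible closing speed (fastest interception).
    for _ in range(50):                # simple back-off search on c
        v_cmd = v_o + c * r_hat
        speed_ok = np.linalg.norm(v_cmd) <= v_max
        accel_ok = np.linalg.norm(v_cmd - v_e) <= a_max * dt
        if speed_ok and accel_ok:
            return v_cmd
        c *= 0.9
    # Fallback: rate-limited step toward the last candidate command.
    dv = (v_o + c * r_hat) - v_e
    dv_norm = np.linalg.norm(dv)
    if dv_norm > a_max * dt:
        dv *= (a_max * dt) / dv_norm
    return v_e + dv
```

In use, a controller would call interception_velocity at each visual-feedback update with the latest measured object state and command the returned task-space velocity; the closing-speed cap in step 2 is what distinguishes rendezvous (velocity matching at contact) from plain interception.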