Autonomous Robotic Inspection and Manipulation Using Multisensor Feedback

  • Authors:
  • Mongi A. Abidi; Richard O. Eason; Rafael C. Gonzalez

  • Affiliations:
  • Univ. of Tennessee, Knoxville (all authors)

  • Venue:
  • Computer - Special issue on instruction sequencing
  • Year:
  • 1991


Abstract

A six-degree-of-freedom industrial robot augmented with a number of sensors (vision, range, sound, proximity, force/torque, and touch) to enhance its inspection and manipulation capabilities is described. The work falls under the scope of partial autonomy: in teleoperation mode, the human operator prepares the robotic system to perform the desired task; using its sensory cues, the system then maps the workspace and performs its operations in a fully autonomous mode; finally, the system reports the success or failure of the task back to the human operator and resumes teleoperation. The feasibility of realistic autonomous robotic inspection and manipulation tasks using multisensory information cues is demonstrated. The focus is on the estimation of the three-dimensional position and orientation of the task panel and on the use of the other, nonvision sensors for valve manipulation. The experiment illustrates the need for multisensory information to accomplish complex, autonomous robotic inspection and manipulation tasks.
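The operating cycle the abstract describes (operator prepares the task in teleoperation mode, the system executes autonomously using sensory cues, then reports back and returns control) can be sketched as a simple state machine. This is a minimal illustration, not the paper's implementation; the class, method names, and the stand-ins for sensor fusion and task execution are all hypothetical.

```python
from enum import Enum, auto


class Mode(Enum):
    TELEOPERATION = auto()
    AUTONOMOUS = auto()


class InspectionRobot:
    """Hypothetical model of the partial-autonomy cycle from the abstract."""

    def __init__(self):
        self.mode = Mode.TELEOPERATION
        self.task = None
        self.log = []

    def prepare(self, task):
        # Operator sets up the desired task while in teleoperation mode.
        assert self.mode is Mode.TELEOPERATION
        self.task = task
        self.log.append(f"prepared: {task}")

    def run_autonomous(self, sensor_cues):
        # Switch to autonomous mode: map the workspace from sensory cues,
        # perform the task, report success/failure, resume teleoperation.
        self.mode = Mode.AUTONOMOUS
        workspace_mapped = bool(sensor_cues)   # stand-in for sensor fusion
        success = workspace_mapped             # stand-in for task execution
        self.log.append(f"{self.task}: {'success' if success else 'failure'}")
        self.mode = Mode.TELEOPERATION         # report back to the operator
        return success


robot = InspectionRobot()
robot.prepare("close valve")
ok = robot.run_autonomous(["vision", "range", "force/torque"])
print(ok, robot.mode.name)  # → True TELEOPERATION
```

The point of the sketch is the control handoff: autonomy is bracketed by teleoperation on both sides, matching the "partial autonomy" framing in the abstract.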