A robot system that observes and replicates grasping tasks

  • Authors: Sing Bing Kang; K. Ikeuchi

  • Venue: ICCV '95 Proceedings of the Fifth International Conference on Computer Vision
  • Year: 1995

Abstract

To alleviate the overwhelming complexity of grasp synthesis and path planning in robot task planning, we adopt the approach of teaching the robot by demonstrating the task in front of it. The system has four components: the observation system, the grasping task recognition module, the task translator, and the robot system. The observation system comprises an active multibaseline stereo system and a dataglove. The recorded data stream is used to track object motion; this paper illustrates how complementary sensory data can be used for this purpose. The data stream is also interpreted by the grasping task recognition module, which produces higher levels of abstraction describing both the motion and the actions taken in the task. The resulting information is provided to the task translator, which creates commands for the robot system to replicate the observed task. In this paper we describe how these components work, with special emphasis on the observation system. The robot system that performs the grasping tasks comprises a PUMA 560 arm and the Utah/MIT hand.
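
To make the data flow between the four components concrete, here is a minimal sketch of the pipeline in Python. Every name in it (Observation, TaskStep, the fixed reach/grasp/release labels, the command strings) is an illustrative assumption for this sketch, not the authors' actual implementation; the real modules process live stereo and dataglove data rather than placeholders.

```python
# Hypothetical sketch of the observe -> recognize -> translate pipeline
# described in the abstract. All names and data shapes are assumptions.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Observation:
    """One time step of demonstration data: stereo depth + glove joint angles."""
    depth_map: List[List[float]]   # from the active multibaseline stereo system
    glove_angles: List[float]      # finger joint readings from the dataglove


@dataclass
class TaskStep:
    """A higher-level abstraction of one recognized action in the task."""
    action: str                    # e.g. "reach", "grasp", "release"
    object_pose: Tuple[float, float, float]  # tracked object position


def observe_demonstration(num_steps: int) -> List[Observation]:
    """Stand-in for the observation system: records a sensor data stream."""
    # The real system fuses complementary stereo and glove data to track
    # object motion; this sketch returns placeholder readings.
    return [Observation(depth_map=[[0.0]], glove_angles=[0.0] * 16)
            for _ in range(num_steps)]


def recognize_grasping_task(stream: List[Observation]) -> List[TaskStep]:
    """Stand-in for the grasping task recognition module."""
    # A real recognizer segments the stream into motion/action phases;
    # this sketch just labels a fixed reach-grasp-release sequence.
    labels = ["reach", "grasp", "release"]
    return [TaskStep(action=labels[min(i, 2)], object_pose=(0.0, 0.0, 0.0))
            for i, _ in enumerate(stream[:3])]


def translate_to_robot_commands(steps: List[TaskStep]) -> List[str]:
    """Stand-in for the task translator: maps abstract steps to commands."""
    return [f"ARM+HAND: {s.action} at pose {s.object_pose}" for s in steps]


if __name__ == "__main__":
    stream = observe_demonstration(num_steps=3)
    steps = recognize_grasping_task(stream)
    for command in translate_to_robot_commands(steps):
        # A real system would dispatch these to the PUMA 560 / Utah-MIT hand.
        print(command)
```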