Robotic grasping and manipulation through human visuomotor learning

  • Authors:
  • Brian Moore; Erhan Oztop

  • Affiliations:
  • ATR Cognitive Mechanisms Laboratories, Kyoto, Japan, and Laval University, Department of Mechanical Engineering, Québec, Canada; NICT Biological ICT Group, Kyoto, Japan, ATR Cognitive Mechanisms Laboratories, Kyoto, Japan, and Ozyegin University, Istanbul, Turkey

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2012

Abstract

A major goal of robotics research is to develop techniques that allow non-experts to teach robots dexterous skills. In this paper, we report our progress on the development of a framework that exploits the human capacity for sensorimotor learning to address this aim. The idea is to place the human operator in the robot control loop, where he or she can intuitively control the robot and, through practice, learn to perform the target task with it. Subsequently, by analyzing the control achieved by the human, it is possible to design a controller that allows the robot to perform the task autonomously. First, we introduce this framework with the ball-swapping task, in which a robot hand must swap the positions of two balls without dropping them, and present new analyses investigating the intrinsic dimension of the ball-swapping skill obtained through this framework. Then, we present new experiments toward obtaining an autonomous grasp controller on an anthropomorphic robot. In these experiments, the operator directly controls the (simulated) robot using visual feedback to achieve robust grasping. The collected data are then analyzed to infer the grasping strategy discovered by the human operator. Finally, we present a method that uses the collected data to generalize grasping actions, allowing the robot to autonomously generate grasps for different orientations of the target object.
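The abstract mentions an intrinsic-dimension analysis of the ball-swapping skill learned through teleoperation. The paper does not specify the estimator here, so the sketch below is only an illustration of one common approach: linear PCA on recorded joint-angle trajectories, counting the components needed to explain a chosen fraction of the variance. The 16-joint hand, the 95% variance threshold, and the synthetic data are assumptions for illustration, not details from the paper.

```python
"""Illustrative sketch (not the paper's method): PCA-based estimate of the
intrinsic dimension of joint-angle trajectories recorded during teleoperation."""
import numpy as np


def intrinsic_dimension(trajectories, variance_threshold=0.95):
    """Number of principal components needed to explain `variance_threshold`
    of the variance in `trajectories` (shape: n_samples x n_joints)."""
    # Center the data and obtain the covariance spectrum via SVD.
    centered = trajectories - trajectories.mean(axis=0)
    singular_values = np.linalg.svd(centered, full_matrices=False)[1]
    explained = singular_values**2 / np.sum(singular_values**2)

    # Smallest number of components whose cumulative variance passes the threshold.
    return int(np.searchsorted(np.cumsum(explained), variance_threshold) + 1)


if __name__ == "__main__":
    # Synthetic stand-in for teleoperation logs: 16 joint angles driven by
    # 3 underlying synergies plus small measurement noise (assumed values).
    rng = np.random.default_rng(0)
    latent = rng.standard_normal((2000, 3))
    mixing = rng.standard_normal((3, 16))
    joint_angles = latent @ mixing + 0.01 * rng.standard_normal((2000, 16))
    print("Estimated intrinsic dimension:", intrinsic_dimension(joint_angles))
```

A linear estimate of this kind gives an upper bound on the intrinsic dimension; if the skill lies on a curved manifold of the joint space, a nonlinear estimator would typically report a lower value.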