Kinesthetic teaching of visuomotor coordination for pointing by the humanoid robot iCub

  • Authors:
  • Andre Lemme, Ananda Freire, Guilherme Barreto, Jochen Steil

  • Affiliations:
  • Research Institute for Cognition and Robotics (CoR-Lab), Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany
  • Research Institute for Cognition and Robotics (CoR-Lab), Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany and Federal University of Ceará, Department of Teleinformatic ...
  • Federal University of Ceará, Department of Teleinformatics Engineering, Av. Mister Hull, S/N - Center of Technology, Campus of Pici, Fortaleza, Ceará, Brazil
  • Research Institute for Cognition and Robotics (CoR-Lab), Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany

  • Venue:
  • Neurocomputing
  • Year:
  • 2013


Abstract

Pointing at something refers to orienting the hand, arm, head, or body in the direction of an object or event. This skill constitutes a basic communicative ability for cognitive agents such as humanoid robots. The goal of this study is to show that approximate and, in particular, precise pointing can be learned as a direct mapping from an object's pixel coordinates in the visual field to hand positions or joint angles. This highly nonlinear mapping defines the pose and orientation of the robot's arm. The study underlines that this is possible without explicitly computing the object's depth or 3D position, since only the pointing direction is required. To this end, three state-of-the-art neural network paradigms (multilayer perceptron, extreme learning machine, and reservoir computing) are evaluated on real-world data gathered from the humanoid robot iCub. For the case of precise pointing, training data are generated interactively and recorded via kinesthetic teaching. Successful generalization is verified on the iCub using a laser pointer attached to its hand.
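To illustrate one of the evaluated paradigms, the sketch below shows an extreme learning machine (ELM) regressing from 2-D pixel coordinates to joint angles: a fixed random hidden layer followed by a ridge-regularized linear readout solved in closed form. All specifics here are assumptions for illustration — the function names, the choice of four output joints, and the synthetic training data are hypothetical stand-ins, not the paper's iCub recordings or its actual network configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, Y, n_hidden=200, ridge=1e-3):
    """Fit an ELM: random fixed input weights, closed-form linear readout."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.normal(size=n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    # Ridge-regularized least squares for the readout weights beta
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Synthetic stand-in data: a smooth nonlinear map from normalized pixel
# coordinates (2-D) to four hypothetical arm joint angles.
X = rng.uniform(0.0, 1.0, size=(500, 2))
Y = np.stack([np.sin(3 * X[:, 0]),
              np.cos(2 * X[:, 1]),
              X[:, 0] * X[:, 1],
              X[:, 0] - X[:, 1]], axis=1)

model = train_elm(X, Y)
mse = np.mean((predict_elm(model, X) - Y) ** 2)
```

Because only the readout is optimized, training reduces to a single linear solve — one reason ELMs are attractive for interactively gathered kinesthetic-teaching data.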