Rigid 3D geometry matching for grasping of known objects in cluttered scenes

  • Authors:
  • Chavdar Papazov, Sami Haddadin, Sven Parusel, Kai Krieger, Darius Burschka

  • Affiliations:
  • Robotics and Embedded Systems, Technische Universität München (TUM), Garching, Germany (C. Papazov, D. Burschka)
  • Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany (S. Haddadin, S. Parusel, K. Krieger)

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2012

Abstract

In this paper, we present an efficient 3D object recognition and pose estimation approach for grasping procedures in cluttered and occluded environments. In contrast to common appearance-based approaches, we rely solely on 3D geometry information. Our method is based on a robust geometric descriptor, a hashing technique, and an efficient, localized RANSAC-like sampling strategy. We assume that each object is represented by a model consisting of a set of points with corresponding surface normals. Our method simultaneously recognizes multiple model instances and estimates their poses in the scene. A variety of tests shows that the proposed method performs well on noisy, cluttered, and unsegmented range scans in which only small parts of the objects are visible. The main procedure of the algorithm has linear time complexity, resulting in a high recognition speed that allows direct integration of the method into a continuous manipulation task. The experimental validation with a seven-degree-of-freedom Cartesian impedance controlled robot shows how the method can be used for grasping objects from a complex random stack. This application demonstrates how the integration of computer vision and soft robotics leads to a robotic system capable of acting in unstructured and occluded environments.
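The abstract describes a RANSAC-like hypothesize-and-test strategy for rigid pose estimation. The following is a minimal illustrative sketch of that general idea, not the paper's actual algorithm: given sampled model/scene point correspondences, a candidate rigid transform is computed in closed form (here via the standard Kabsch/SVD method), and a pose hypothesis is scored by how many transformed model points land near scene points. All function names and the acceptance threshold `eps` are assumptions for illustration.

```python
import numpy as np

def rigid_transform(model_pts, scene_pts):
    """Least-squares rigid transform (R, t) with scene ≈ R @ model + t,
    computed via the Kabsch/SVD method from paired 3D points."""
    m_c = model_pts.mean(axis=0)            # model centroid
    s_c = scene_pts.mean(axis=0)            # scene centroid
    H = (model_pts - m_c).T @ (scene_pts - s_c)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct the sign so R is a proper rotation (no reflection).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = s_c - R @ m_c
    return R, t

def score_pose(R, t, model_pts, scene_pts, eps=0.01):
    """Hypothesis score: number of transformed model points that lie
    within distance eps of some scene point (a simple consensus count)."""
    transformed = model_pts @ R.T + t
    dists = np.linalg.norm(
        transformed[:, None, :] - scene_pts[None, :, :], axis=2)
    return int((dists.min(axis=1) < eps).sum())
```

In a RANSAC-style loop, `rigid_transform` would be called on small random correspondence samples and `score_pose` used to keep the best-supported hypothesis; the paper's localized sampling and hashing serve to drastically shrink the space of correspondences that must be tried.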