Data-driven grasping with partial sensor data

  • Authors:
  • Corey Goldfeder, Matei Ciocarlie, Jaime Peretzman, Hao Dang, Peter K. Allen

  • Affiliations:
  • Dept. of Computer Science, Columbia University, NY (all authors)

  • Venue:
  • IROS '09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009

Abstract

To grasp a novel object, we can index it into a database of known 3D models and use precomputed grasp data for those models to suggest a new grasp. We refer to this idea as data-driven grasping, and we have previously introduced the Columbia Grasp Database for this purpose. In this paper we demonstrate a data-driven grasp planner that requires only partial 3D data of an object in order to grasp it. To achieve this, we introduce a new shape descriptor for partial 3D range data, along with an alignment method that can rigidly register partial 3D models to models that are globally similar but not identical. Our method uses SIFT features of depth images, and encapsulates "nearby" views of an object in a compact shape descriptor.
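
The sketch below illustrates the retrieval step of this pipeline in Python with OpenCV, under stated assumptions: it extracts SIFT features from a single depth image, mean-pools them into one global vector, and finds the nearest precomputed model descriptor. The function names (`depth_to_descriptor`, `nearest_model`) are hypothetical, and mean-pooling a single view is a simplification of the paper's descriptor, which aggregates SIFT features over several "nearby" rendered views; this is not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of matching a partial depth view
# against a database of known 3D models, assuming OpenCV >= 4.4 and NumPy.
import cv2
import numpy as np

def depth_to_descriptor(depth):
    """Build a crude global descriptor from SIFT features of one depth image.

    Hypothetical helper: the paper's descriptor covers multiple nearby views;
    here a single view is mean-pooled as a simplified stand-in.
    """
    # SIFT expects an 8-bit grayscale image, so normalize the depth map first.
    img = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    sift = cv2.SIFT_create()
    _, feats = sift.detectAndCompute(img, None)
    if feats is None:  # featureless view: no keypoints detected
        return np.zeros(128, dtype=np.float32)
    return feats.mean(axis=0)  # mean of the 128-D local SIFT descriptors

def nearest_model(query_depth, db_descriptors):
    """Return the index of the database model whose descriptor is closest."""
    q = depth_to_descriptor(query_depth)
    dists = np.linalg.norm(db_descriptors - q, axis=1)
    return int(np.argmin(dists))

# Usage: db_descriptors is an (N, 128) array precomputed from rendered depth
# views of the known models. After retrieval, the matched model's precomputed
# grasps would be rigidly aligned to the sensed partial data and re-tested.
```

In the full system described by the abstract, this lookup is followed by the rigid registration step, which aligns the retrieved model to the partial scan so that its precomputed grasps can be transferred to the novel object.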