Templates for pre-grasp sliding interactions

  • Authors:
  • Daniel Kappler; Lillian Y. Chang; Nancy S. Pollard; Tamim Asfour; Rüdiger Dillmann

  • Affiliations:
  • Institute for Anthropomatics, Karlsruhe Institute of Technology, Karlsruhe, Germany; Robotics Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, United States and Intel Corporation, Intel Science and Technology Center, University of Washington, Seattle, WA, United States; Robotics Institute, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, United States; Institute for Anthropomatics, Karlsruhe Institute of Technology, Karlsruhe, Germany; Institute for Anthropomatics, Karlsruhe Institute of Technology, Karlsruhe, Germany

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2012

Abstract

In manipulation tasks that require object acquisition, pre-grasp interaction such as sliding adjusts the object in the environment before grasping. This change in object placement can improve grasping success by making desired grasps reachable. However, the additional sliding action prior to grasping introduces more complexity to the motion planning process, since the hand pose relative to the object does not need to remain fixed during the pre-grasp interaction. Furthermore, anthropomorphic hands in humanoid robots have several degrees of freedom that could be utilized to improve the object interaction beyond a fixed grasp shape. We present a framework for synthesizing pre-grasp interactions for high-dimensional anthropomorphic manipulators. The motion planning is tractable because information from pre-grasp manipulation examples reduces the search space to promising hand poses and shapes. In particular, we show the value of organizing the example data according to object category templates. The template information focuses the search based on the object features, resulting in increased success of adapting a template pose and decreased planning time.
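The abstract's core idea, using object category templates from example data to restrict the planner's search to promising hand poses and shapes, can be pictured with a small sketch. The following Python snippet is an illustrative assumption, not the authors' implementation; all class, function, and data names (PreGraspTemplate, TEMPLATE_LIBRARY, match_category, adapt_template, plan_pregrasp) are hypothetical, and the feasibility test stands in for a real motion planner.

```python
"""Illustrative sketch (not the paper's code): object-category templates seed a
pre-grasp planner with example-derived candidates instead of a full-space search.
All names below are hypothetical."""
from dataclasses import dataclass

@dataclass
class PreGraspTemplate:
    category: str       # object category the example came from (e.g. "box")
    palm_offset: tuple  # example palm position relative to the object (x, y, z)
    hand_shape: tuple   # example finger joint angles (reduced set, for brevity)

# Example data organized by object category, as the abstract describes.
TEMPLATE_LIBRARY = {
    "box":      [PreGraspTemplate("box",      (0.00, -0.12, 0.03), (0.1, 0.4, 0.4))],
    "cylinder": [PreGraspTemplate("cylinder", (0.00, -0.10, 0.05), (0.2, 0.5, 0.5))],
}

def match_category(object_features: dict) -> str:
    """Pick a template category from coarse object features (toy heuristic)."""
    return "cylinder" if object_features.get("round", False) else "box"

def adapt_template(template: PreGraspTemplate, object_size: float) -> dict:
    """Scale the example palm offset to the object's size; keep the hand shape."""
    return {
        "palm_offset": tuple(object_size * v for v in template.palm_offset),
        "hand_shape": template.hand_shape,
    }

def plan_pregrasp(object_features: dict, object_size: float):
    """Search only over template-derived candidates rather than the full space."""
    category = match_category(object_features)
    for template in TEMPLATE_LIBRARY[category]:
        candidate = adapt_template(template, object_size)
        # A real system would check reachability and collisions with a motion planner here.
        if candidate["palm_offset"][1] < 0:  # placeholder feasibility test
            return candidate
    return None

if __name__ == "__main__":
    print(plan_pregrasp({"round": True}, object_size=1.2))
```

The point of the sketch is the control flow: object features select a category, the category indexes into example data, and only the adapted examples are handed to the (here stubbed-out) planner, which is how the template organization can reduce planning time while increasing adaptation success.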