Interactive segmentation for manipulation in unstructured environments

  • Authors:
  • Jacqueline Kenney, Thomas Buckley, Oliver Brock

  • Affiliations:
  • Robotics and Biology Laboratory, Department of Computer Science, University of Massachusetts, Amherst (all authors)

  • Venue:
  • ICRA '09: Proceedings of the 2009 IEEE International Conference on Robotics and Automation
  • Year:
  • 2009

Abstract

To perform successful manipulation, robots depend on information about objects in their environment. In unstructured environments, such information cannot be given to the robot a priori. It is thus critical for the robot to be able to continuously acquire task-specific information about objects. Towards this goal, we present a robust perceptual skill for identifying, tracking, and segmenting objects in a cluttered environment. We increase the robot's perceptual capabilities by closely coupling them with its manipulation skills. The robot's interaction with objects in the environment creates a perceptual signal, i.e., motion, that renders segmentation and tracking robust and reliable. In addition, the resulting perceptual signal reveals the type of segmentation most relevant to manipulation, namely a segmentation into rigidly connected physical bodies. We demonstrate our approach with experiments on a real-world mobile manipulation platform with multiple objects in a cluttered scene.
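
The core idea of the abstract, interacting with the scene so that the induced motion reveals object boundaries, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it tracks sparse features across a before/after image pair of a push interaction using pyramidal Lucas-Kanade optical flow, then greedily groups features whose displacements agree, on the assumption that points on the same rigid body move alike. The file names, the motion threshold, and the rigid-translation approximation are all illustrative assumptions.

    import cv2
    import numpy as np

    # Before/after frames of a push interaction; file names are illustrative.
    before = cv2.imread("before_push.png", cv2.IMREAD_GRAYSCALE)
    after = cv2.imread("after_push.png", cv2.IMREAD_GRAYSCALE)

    # Detect corner features in the pre-interaction frame.
    pts = cv2.goodFeaturesToTrack(before, maxCorners=400,
                                  qualityLevel=0.01, minDistance=7)

    # Track them into the post-interaction frame (pyramidal Lucas-Kanade).
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        before, after, pts, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    p0 = pts[ok].reshape(-1, 2)
    p1 = new_pts[ok].reshape(-1, 2)
    disp = p1 - p0  # per-feature displacement induced by the interaction

    # Features that barely moved are treated as static background.
    MOTION_THRESH = 2.0  # pixels; an assumed, setup-dependent value
    moving = np.linalg.norm(disp, axis=1) > MOTION_THRESH

    # Greedily group moving features by displacement similarity: under a
    # rigid-translation approximation (no rotation), points on the same
    # rigid body share one displacement vector, so one cluster ~ one body.
    labels = np.full(len(disp), -1, dtype=int)
    n_groups = 0
    for i in np.flatnonzero(moving):
        if labels[i] != -1:
            continue
        similar = np.linalg.norm(disp - disp[i], axis=1) < MOTION_THRESH
        labels[moving & similar & (labels == -1)] = n_groups
        n_groups += 1

    print(f"{int(moving.sum())} moving features in {n_groups} rigid group(s)")

In practice one would trigger the push deliberately and fit full rigid-motion models (e.g., with RANSAC) rather than pure translations, but even this sketch shows why induced motion makes the segmentation well-defined: cluster membership follows physical connectedness rather than appearance.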