Mixed 2D/3D perception for autonomous robots in unstructured environments

  • Authors:
  • Johannes Pellenz, Frank Neuhaus, Denis Dillenberger, David Gossow, Dietrich Paulus

  • Affiliations:
  • Bundeswehr Technical Center for Engineer and General Field Equipment, Koblenz, Germany; University of Koblenz-Landau, Koblenz, Germany

  • Venue:
  • RoboCup 2010
  • Year:
  • 2011


Abstract

Autonomous robots in real-world applications have to deal with a complex 3D environment, but are often equipped only with standard 2D laser range finders (LRFs). By using the 2D LRF both for 2D localization and mapping (which can be done efficiently and precisely) and for 3D obstacle detection (which lets the robot move safely), a completely autonomous robot can be built with affordable 2D LRFs. We use the 2D LRF to perform particle-filter-based SLAM, generating a 2D occupancy grid, and the same LRF (moved by two servo motors) to acquire 3D scans that reveal obstacles not visible in the 2D scans. The 3D data is analyzed with a method based on recursive principal component analysis (PCA), and the detected obstacles are recorded in a separate obstacle map. This obstacle map and the occupancy map are merged for path planning. Our solution was tested on our mobile system Robbie during the RoboCup Rescue competitions in 2008 and 2009, winning the mapping challenge at the 2008 world championship and at the German Open 2009. This shows that the benefit of a sensor can be dramatically increased by controlling it actively, and that mixed 2D/3D perception can be achieved efficiently with a single standard 2D sensor.
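To illustrate the idea behind PCA-based obstacle detection in 3D scans, the sketch below classifies a local patch of points as drivable ground or as an obstacle from the eigenstructure of its covariance matrix. This is a minimal, generic sketch, not the paper's recursive algorithm; the function name `classify_patch` and the thresholds `planarity_thresh` and `normal_z_thresh` are illustrative assumptions.

```python
import numpy as np

def classify_patch(points, planarity_thresh=0.05, normal_z_thresh=0.8):
    """Classify a patch of 3D points as 'ground' or 'obstacle' via PCA.

    points: (N, 3) array-like of x, y, z coordinates, N >= 3.
    planarity_thresh, normal_z_thresh: hypothetical tuning values,
    not taken from the paper.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    # eigh returns eigenvalues of a symmetric matrix in ascending order
    eigvals, eigvecs = np.linalg.eigh(cov)
    normal = eigvecs[:, 0]  # direction of least variance = surface normal
    planarity = eigvals[0] / max(eigvals.sum(), 1e-12)
    if planarity > planarity_thresh:
        return "obstacle"   # points scatter off any single plane
    if abs(normal[2]) < normal_z_thresh:
        return "obstacle"   # planar but tilted, e.g. a wall or ramp edge
    return "ground"         # flat, roughly horizontal surface
```

A horizontal patch yields a near-vertical normal and is accepted as ground, while a wall-like patch yields a near-horizontal normal and is flagged as an obstacle; in the paper's pipeline such obstacle cells would then be entered into the separate obstacle map.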