Towards semantic maps for mobile robots

  • Authors:
  • Andreas Nüchter; Joachim Hertzberg

  • Affiliations:
  • University of Osnabrück, Institute of Computer Science, Knowledge-Based Systems Research Group, Albrechtstr. 28, D-49069 Osnabrück, Germany (both authors)

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2008

Abstract

Intelligent autonomous action in ordinary environments calls for maps. 3D geometry is generally required for avoiding collisions with complex obstacles and for self-localization in six degrees of freedom (6 DoF: x, y, z position plus roll, pitch, and yaw angles). Meaning, in addition to geometry, becomes indispensable if the robot is supposed to interact with its environment in a goal-directed way. A semantic stance enables the robot to reason about objects; it helps disambiguate or complete sensor data; and the robot's knowledge becomes reviewable and communicable. The paper describes an approach and an integrated robot system for semantic mapping. The prime sensor is a 3D laser scanner. Individual scans are registered into a coherent 3D geometry map by 6D SLAM. Coarse scene features (e.g., walls and floors in a building) are determined by semantic labeling. More delicate objects are then detected by a trained classifier and localized. Finally, the semantic maps can be visualized for human inspection. We sketch the overall architecture of the approach, explain the respective steps and their underlying algorithms, give examples based on a working robot implementation, and discuss the findings.
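
The registration step named in the abstract, 6D SLAM, brings consecutive 3D scans into one coordinate frame. The following is a minimal sketch of the core computation only, not the authors' implementation: it assumes Eigen is available and that point correspondences model[i] <-> scene[i] have already been found, and estimates the 6 DoF rigid transform between the matched sets with the SVD method of Arun et al. (1987).

    // Minimal sketch (not the authors' code): one rigid-alignment step of
    // pairwise 3D scan registration, as used inside ICP-style 6D SLAM.
    #include <Eigen/Dense>
    #include <cstddef>
    #include <vector>

    // 6 DoF pose: rotation R and translation t, with R * p + t mapping
    // model points onto scene points.
    struct Pose6D {
      Eigen::Matrix3d R;
      Eigen::Vector3d t;
    };

    // Least-squares rigid transform between matched point sets
    // (SVD method of Arun et al., 1987). Assumes equal, nonzero sizes.
    Pose6D alignPairs(const std::vector<Eigen::Vector3d>& model,
                      const std::vector<Eigen::Vector3d>& scene) {
      const double n = static_cast<double>(model.size());
      Eigen::Vector3d cm = Eigen::Vector3d::Zero(), cs = Eigen::Vector3d::Zero();
      for (std::size_t i = 0; i < model.size(); ++i) { cm += model[i]; cs += scene[i]; }
      cm /= n; cs /= n;                          // centroids of both sets

      Eigen::Matrix3d H = Eigen::Matrix3d::Zero();
      for (std::size_t i = 0; i < model.size(); ++i)
        H += (model[i] - cm) * (scene[i] - cs).transpose();   // cross-covariance

      Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
      Eigen::Matrix3d R = svd.matrixV() * svd.matrixU().transpose();
      if (R.determinant() < 0.0) {               // guard against reflections
        Eigen::Matrix3d V = svd.matrixV();
        V.col(2) *= -1.0;
        R = V * svd.matrixU().transpose();
      }
      return { R, cs - R * cm };
    }

A full ICP loop would apply this estimate, recompute nearest-neighbor correspondences, and iterate until convergence; registering many scans into one coherent map additionally requires the global relaxation over all scan poses that 6D SLAM performs.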
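The coarse semantic labeling of scene features can likewise be illustrated with a simple rule over extracted planar patches. The rule set below is a hypothetical stand-in for the paper's labeling, with assumed axis convention, angular tolerance, and height threshold: a plane whose normal is (anti)parallel to the gravity axis is a floor or ceiling, one perpendicular to it is a wall.

    // Hedged illustration (not the paper's exact rules): coarse labels for
    // planar patches from the angle between plane normal and gravity axis.
    #include <Eigen/Dense>
    #include <cmath>
    #include <string>

    std::string labelPlane(const Eigen::Vector3d& normal, double heightAboveGround) {
      const Eigen::Vector3d up(0.0, 0.0, 1.0);   // assumed gravity-aligned z axis
      const double c = std::abs(normal.normalized().dot(up));
      const double tol = 0.15;                   // assumed angular tolerance
      if (c > 1.0 - tol)                         // normal (anti)parallel to gravity
        return heightAboveGround < 0.2 ? "floor" : "ceiling";  // assumed 0.2 m split
      if (c < tol)                               // normal perpendicular to gravity
        return "wall";
      return "unknown";
    }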