Laser-based navigation enhanced with 3D time-of-flight data

  • Authors:
  • Fang Yuan; Agnes Swadzba; Roland Philippsen; Orhan Engin; Marc Hanheide; Sven Wachsmuth

  • Affiliations:
  • Applied Informatics, Bielefeld University, Bielefeld, Germany (Yuan, Swadzba, Engin, Hanheide, Wachsmuth); Robotics and Artificial Intelligence Lab, Stanford University, Stanford, CA, USA (Philippsen)

  • Venue:
  • ICRA '09: Proceedings of the 2009 IEEE International Conference on Robotics and Automation
  • Year:
  • 2009

Abstract

Navigation and obstacle avoidance based on planar laser scans has matured in robotics over the last decades. It enables robots to move smoothly through highly dynamic and populated spaces, such as people's homes. However, in an unconstrained environment the two-dimensional perceptual space of a fixed-mounted laser is not sufficient to ensure safe navigation. In this paper, we present an approach that combines a fast and reliable motion generation method with modern 3D capturing techniques using a Time-of-Flight camera. Instead of attempting full 3D motion control, which is computationally more expensive and simply not needed for the targeted scenario of a domestic robot, we introduce a "virtual laser": real laser measurements and 3D point clouds are fused into a continuous data stream that remains fully compatible with, and transparent to, the originally laser-only motion generation. The paper covers the general concept and the necessary extrinsic calibration of the two very different types of sensors, and illustrates the benefit by example: the robot avoids obstacles that are not perceivable in the original laser scan.
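
The abstract leaves the fusion step implicit. The sketch below shows one plausible reading of the "virtual laser" idea: ToF points, already expressed in the laser frame via the extrinsic calibration, are projected onto the scan plane and merged per beam with the real scan by taking the minimum range. All function and parameter names here are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def fuse_virtual_laser(laser_ranges, angle_min, angle_increment,
                       cloud_in_laser_frame,
                       z_min=0.05, z_max=1.5, range_max=8.0):
    """Merge a planar laser scan with a 3D point cloud into one
    'virtual' scan (hypothetical sketch of the paper's idea).

    laser_ranges: (N,) beam ranges of the real scan [m]
    angle_min, angle_increment: bearing of beam 0 and angular step [rad]
    cloud_in_laser_frame: (M, 3) ToF points, assumed to be already
        transformed into the laser frame by the extrinsic calibration
    z_min, z_max: height band of points treated as obstacles [m]
    """
    fused = np.asarray(laser_ranges, dtype=float).copy()
    n_beams = fused.size

    pts = np.asarray(cloud_in_laser_frame, dtype=float)
    # Keep only points within the obstacle-relevant height band.
    band = (pts[:, 2] >= z_min) & (pts[:, 2] <= z_max)
    pts = pts[band]
    if pts.size == 0:
        return fused

    # Project each 3D point onto the scan plane: bearing and range.
    bearings = np.arctan2(pts[:, 1], pts[:, 0])
    ranges = np.hypot(pts[:, 0], pts[:, 1])

    # Map each bearing to the index of the nearest laser beam.
    idx = np.round((bearings - angle_min) / angle_increment).astype(int)
    valid = (idx >= 0) & (idx < n_beams) & (ranges < range_max)

    # Per beam, keep the minimum range seen by either sensor.
    np.minimum.at(fused, idx[valid], ranges[valid])
    return fused
```

Because the output has exactly the shape and semantics of an ordinary planar scan, a downstream laser-based obstacle avoider can consume it unchanged, which is what makes the approach transparent to the existing motion generation.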