A nonparametric learning approach to range sensing from omnidirectional vision

  • Authors:
  • Christian Plagemann;Cyrill Stachniss;Jürgen Hess;Felix Endres;Nathan Franklin

  • Affiliations:
Christian Plagemann: Stanford University, Computer Science Department, 353 Serra Mall, Stanford, CA 94305-9010, United States; all other authors: University of Freiburg, Department of Computer Science, Georges-Koehler-Allee 79, 79110 Freiburg, Germany

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2010

Abstract

We present a novel approach to estimating depth from single omnidirectional camera images by learning the relationship between visual features and the range measurements available during a training phase. Our model yields not only the most likely distance to obstacles in all directions but also the predictive uncertainty of these estimates. A mobile robot can use this information to build an occupancy grid map of the environment or to avoid obstacles during exploration, tasks that typically require dedicated proximity sensors such as laser range finders or sonars. In this paper we show how an omnidirectional camera can serve as an alternative to such range sensors. As the learning engine, we apply Gaussian processes, a nonparametric approach to function regression, together with a recently developed extension for dealing with input-dependent noise. In practical experiments carried out with a mobile robot equipped with an omnidirectional camera system in different indoor environments, we demonstrate that our system estimates range with an accuracy comparable to that of dedicated sensors based on sonar or infrared light.
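The core machinery the abstract describes is Gaussian process regression: given training pairs of visual features and measured ranges, the GP posterior provides both a predicted distance and a per-query variance. The following is a minimal NumPy sketch of standard (homoscedastic) GP regression with a squared-exponential kernel, not the authors' implementation; the paper's extension would additionally let the noise variance depend on the input. All function names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.5, signal_var=1.0):
    """Squared-exponential covariance between the row vectors of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise_var=0.01):
    """GP regression: predictive mean and variance at the test inputs.

    Note: noise_var is a single constant here; the heteroscedastic
    extension used in the paper would model it as a function of the input.
    """
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                     # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                          # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.maximum(np.diag(K_ss) - np.sum(v**2, axis=0), 0.0)
    return mean, var

# Toy usage: a 1-D stand-in for the feature-to-range mapping.
X = np.linspace(0.0, 1.0, 20)[:, None]
y = np.sin(2.0 * np.pi * X).ravel()
mean, var = gp_predict(X, y, X)
```

Far from the training data the predictive variance rises back toward the prior signal variance, which is exactly the uncertainty signal a robot can exploit when deciding whether to trust a range estimate for mapping or obstacle avoidance.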