Omni-Directional Vision for Robot Navigation

  • Authors:
  • Niall Winters; José Gaspar; Gerard Lacey; José Santos-Victor

  • Venue:
  • OMNIVIS '00 Proceedings of the IEEE Workshop on Omnidirectional Vision
  • Year:
  • 2000

Abstract

We describe a method for vision-based robot navigation with a single omni-directional (catadioptric) camera. We show how omni-directional images can be used to generate the representations needed for two main navigation modalities: Topological Navigation and Visual Path Following.

Topological Navigation relies on the robot's qualitative global position, estimated from a set of omni-directional images obtained during a training stage (compressed using PCA). To deal with illumination changes, an eigenspace approximation to the Hausdorff measure is exploited. We present a method to transform omni-directional images into Bird's Eye Views, which correspond to scaled orthographic views of the ground plane. These images are used to locally control the orientation of the robot through visual servoing.

Visual Path Following is used to accurately control the robot along a prescribed trajectory, using Bird's Eye Views to track landmarks on the ground plane. Due to the simplified geometry of these images, the robot's pose can be estimated easily and used for accurate trajectory following.

Omni-directional images facilitate landmark-based navigation, since landmarks remain visible in all images, in contrast to the small field of view of a standard camera. In addition, omni-directional images provide adequate representations to support both accurate and qualitative navigation. Results are described in the paper.
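
As a rough illustration of the topological localization step, the sketch below compresses a set of training omni-directional images with PCA and localizes a query image by nearest neighbour in the resulting eigenspace. The image dimensions, the number of components and the matching rule are illustrative assumptions rather than the paper's parameters, and the Hausdorff-based matching used there for illumination robustness is not reproduced here.

```python
# Minimal sketch of appearance-based topological localization, assuming a set
# of training omni-directional images captured along the robot's route.
import numpy as np

def build_eigenspace(train_images, n_components=10):
    """Compress training images with PCA: returns mean, basis and projections."""
    X = np.stack([img.ravel().astype(np.float64) for img in train_images])
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centred data gives the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:n_components]            # (k, pixels)
    coords = Xc @ basis.T                # training images projected into eigenspace
    return mean, basis, coords

def localize(query_image, mean, basis, coords):
    """Index of the training image closest to the query in eigenspace,
    i.e. the robot's qualitative position along the trained route."""
    q = basis @ (query_image.ravel().astype(np.float64) - mean)
    return int(np.argmin(np.linalg.norm(coords - q, axis=1)))
```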
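
For the Visual Path Following side, the hedged sketch below first dewarps the omni-directional image into a bird's eye view, assuming precomputed per-pixel lookup maps (map_x, map_y) obtained from the catadioptric calibration, and then recovers the robot's planar pose by rigidly aligning tracked ground-plane landmarks with their known positions. The landmark tracker and the calibration itself are outside this sketch, and the alignment is a generic 2-D Procrustes fit, not necessarily the estimator used in the paper.

```python
# Hypothetical bird's eye view dewarping plus 2-D pose recovery from
# ground-plane landmarks; map_x / map_y are assumed calibration products.
import numpy as np
import cv2

def birds_eye_view(omni_image, map_x, map_y):
    """Per-pixel lookup from bird's-eye-view pixels back into the omni image."""
    return cv2.remap(omni_image, map_x, map_y, interpolation=cv2.INTER_LINEAR)

def estimate_pose_2d(landmarks_map, landmarks_view):
    """Rotation angle and translation aligning known ground-plane landmark
    positions (N x 2) with their measurements in the bird's eye view (N x 2)."""
    a = landmarks_map - landmarks_map.mean(axis=0)
    b = landmarks_view - landmarks_view.mean(axis=0)
    # 2-D Kabsch / Procrustes: SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(b.T @ a)
    if np.linalg.det(U @ Vt) < 0:        # enforce a proper rotation
        U[:, -1] *= -1
    R = U @ Vt
    theta = np.arctan2(R[1, 0], R[0, 0])
    t = landmarks_view.mean(axis=0) - landmarks_map.mean(axis=0) @ R.T
    return theta, t                      # robot heading and position offset
```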