Self-location from monocular uncalibrated vision using reference omniviews

  • Authors:
  • L. Puig; J. J. Guerrero

  • Affiliations:
  • DIIS, Universidad de Zaragoza, Zaragoza, Spain (both authors)

  • Venue:
  • IROS '09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009

Abstract

In this paper we present a novel approach to indoor self-localization using reference omnidirectional images. Only one omnidirectional image of the whole scene, stored in the robot's memory, and a conventional uncalibrated on-board camera are required. We match the reference omnidirectional image against the conventional images captured by the on-board camera and compute the hybrid epipolar geometry using lifted coordinates and robust estimation techniques. We then map the epipole in the reference omnidirectional image to the ground plane through a homography, also expressed in lifted coordinates, which yields the robot's position on the planar ground together with its uncertainty. Experiments with simulated and real data show the feasibility of this new self-localization approach.
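As a rough illustration of the lifted coordinates the abstract refers to, a 2D point in homogeneous coordinates can be lifted via the second-order Veronese map to a 6-vector, which lets conic-like entities (such as the projections arising in catadioptric images) be handled linearly. The sketch below shows only this lifting and the general shape of a hybrid epipolar constraint; the 6x3 matrix `F_hyb` is a hypothetical placeholder, not the matrix estimated in the paper.

```python
import numpy as np

def lift(p):
    """Second-order Veronese lifting of a homogeneous 2D point:
    (x, y, w) -> (x^2, x*y, y^2, x*w, y*w, w^2)."""
    x, y, w = p
    return np.array([x * x, x * y, y * y, x * w, y * w, w * w])

# Sketch of a hybrid epipolar constraint between a lifted omnidirectional
# point x_hat (6-vector) and a perspective point x (3-vector):
#     x_hat^T @ F_hyb @ x = 0
# F_hyb here is an arbitrary 6x3 placeholder, for shape only.
x_hat = lift(np.array([1.0, 2.0, 1.0]))   # lifted omni point
x_persp = np.array([0.5, -1.0, 1.0])      # perspective point
F_hyb = np.zeros((6, 3))                  # hypothetical hybrid fundamental matrix
residual = x_hat @ F_hyb @ x_persp        # scalar epipolar residual
```

In the paper's setting, `F_hyb` would be estimated from point matches between the omnidirectional and conventional images using robust techniques; the lifting is what makes that estimation linear.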