Real-Time Simultaneous Localisation and Mapping with a Single Camera

  • Authors: Andrew J. Davison
  • Venue: ICCV '03 Proceedings of the Ninth IEEE International Conference on Computer Vision - Volume 2
  • Year: 2003

Abstract

Ego-motion estimation for an agile single camera moving through general, unknown scenes becomes a much more challenging problem when real-time performance is required rather than under the off-line processing conditions under which most successful structure from motion work has been achieved. This task of estimating camera motion from measurements of a continuously expanding set of self-mapped visual features is one of a class of problems known as Simultaneous Localisation and Mapping (SLAM) in the robotics community, and we argue that such real-time mapping research, despite rarely being camera-based, is more relevant here than off-line structure from motion methods due to the more fundamental emphasis placed on propagation of uncertainty. We present a top-down Bayesian framework for single-camera localisation via mapping of a sparse set of natural features using motion modelling and an information-guided active measurement strategy, in particular addressing the difficult issue of real-time feature initialisation via a factored sampling approach. Real-time handling of uncertainty permits robust localisation via the creation and active measurement of a sparse map of landmarks such that regions can be re-visited after periods of neglect and localisation can continue through periods when few features are visible. Results are presented of real-time localisation for a hand-waved camera with very sparse prior scene knowledge and all processing carried out on a desktop PC.
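The factored-sampling feature initialisation mentioned in the abstract can be illustrated with a minimal sketch: a new feature's unknown depth along its viewing ray is represented by a set of weighted hypotheses, which are reweighted by the measurement likelihood at each frame until the distribution is peaked enough to collapse to a Gaussian. All numbers below (depth range, particle count, the Gaussian pseudo-likelihood standing in for the paper's image-plane measurement likelihood) are hypothetical, not taken from the paper:

```python
import math

def init_depth_particles(n=100, d_min=0.5, d_max=5.0):
    """Uniform prior: n depth hypotheses spread along the feature's ray
    between hypothetical near and far limits (in metres)."""
    step = (d_max - d_min) / (n - 1)
    depths = [d_min + i * step for i in range(n)]
    weights = [1.0 / n] * n
    return depths, weights

def reweight(depths, weights, likelihood):
    """Multiply each hypothesis by the likelihood of the current
    measurement given that depth, then renormalise the weights."""
    w = [wi * likelihood(di) for di, wi in zip(depths, weights)]
    total = sum(w)
    return [wi / total for wi in w]

def moments(depths, weights):
    """Mean and variance of the weighted depth distribution; once the
    variance is small, the feature can be promoted to the main map."""
    mean = sum(d * w for d, w in zip(depths, weights))
    var = sum(w * (d - mean) ** 2 for d, w in zip(depths, weights))
    return mean, var

# Simulated frames: a Gaussian pseudo-likelihood peaked at a true
# depth of 2.0 m with 0.3 m measurement spread (illustrative values).
depths, weights = init_depth_particles()
for _ in range(10):
    weights = reweight(
        depths, weights,
        lambda d: math.exp(-0.5 * ((d - 2.0) / 0.3) ** 2),
    )
mean, var = moments(depths, weights)
```

After a handful of frames the weights concentrate near the true depth and the variance shrinks, at which point a single Gaussian depth estimate can replace the particle set, keeping the real-time map state small.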