Real-Time Localisation and Mapping with Wearable Active Vision

  • Authors:
  • Andrew J. Davison; Walterio W. Mayol; David W. Murray


  • Venue:
  • ISMAR '03 Proceedings of the 2nd IEEE/ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2003


Abstract

We present a general method for real-time, vision-only, single-camera simultaneous localisation and mapping (SLAM) - an algorithm applicable to the localisation of any camera moving through a scene - and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on-the-fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly-registered graphics, but further it permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is the key to the understanding or "management" of a workspace which the robot needs to have in order to assist its wearer usefully in tasks. We believe that the techniques and technology developed are of particular immediate value in scenarios of remote collaboration, where a remote expert is able to annotate, through the robot, the environment the wearer is working in.
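The joint estimation the abstract describes - camera motion and map features refined together from each observation - is commonly realised with an extended Kalman filter over a stacked state of pose and feature positions. The toy sketch below illustrates that coupling in one dimension; the state, motion model, and measurement model here are hypothetical simplifications for illustration, not the paper's actual formulation.

```python
import numpy as np

def ekf_predict(x, P, Q):
    # Identity motion model: the state is unchanged, but process
    # noise Q inflates the covariance (only the camera entry here).
    return x, P + Q

def ekf_update(x, P, z, H, R):
    # Standard linear EKF update for measurement z with Jacobian H
    # and measurement noise R.
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical 1D state: [camera position, feature position].
x = np.array([0.0, 1.0])
P = np.diag([0.5, 0.5])                 # uncertain camera and feature
Q = np.diag([0.01, 0.0])                # only the camera moves
H = np.array([[-1.0, 1.0]])             # measure feature minus camera
R = np.array([[0.01]])

x, P = ekf_predict(x, P, Q)
x, P = ekf_update(x, P, np.array([1.05]), H, R)
print(x, np.diag(P))
```

Because pose and feature share one covariance matrix, a single relative measurement shrinks the uncertainty of both entries at once - the mechanism that lets a sparse map and the camera trajectory be estimated simultaneously.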