Real-Time Visual Workspace Localisation and Mapping for a Wearable Robot
ISMAR '03 Proceedings of the 2nd IEEE/ACM International Symposium on Mixed and Augmented Reality
We present a general method for real-time, vision-only single-camera simultaneous localisation and mapping (SLAM) - an algorithm applicable to the localisation of any camera moving through a scene - and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on-the-fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly-registered graphics, but further it permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is key to the understanding or "management" of a workspace which the robot needs in order to assist its wearer usefully in tasks. We believe that the techniques and technology developed are of particular immediate value in scenarios of remote collaboration, where a remote expert is able to annotate, through the robot, the environment in which the wearer is working.
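The core of single-camera SLAM systems of this kind is a joint probabilistic state holding the camera pose together with the 3D positions of the mapped point features, typically maintained with an extended Kalman filter. The toy sketch below illustrates that joint state and its predict/update cycle under heavy simplification: a constant-position camera motion model and a linear relative-position measurement stand in for the real constant-velocity model and perspective projection, and all function names and noise values are illustrative, not from the paper.

```python
import numpy as np

def predict(x, P, Q_cam):
    """EKF prediction. State x = [camera xyz, lm0 xyz, lm1 xyz, ...].

    Toy constant-position motion model: the camera estimate is unchanged
    but its uncertainty grows by Q_cam; landmarks are static, so their
    covariance blocks are untouched.
    """
    P = P.copy()
    P[:3, :3] += Q_cam
    return x, P

def update(x, P, z, lm_idx, R):
    """EKF update from one observation of landmark lm_idx.

    Simplified linear measurement model: z = landmark position - camera
    position (plus noise R), in place of perspective projection.
    """
    n = x.size
    j = 3 + 3 * lm_idx                 # offset of this landmark in the state
    H = np.zeros((3, n))               # measurement Jacobian (exact here)
    H[:, :3] = -np.eye(3)
    H[:, j:j + 3] = np.eye(3)
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(n) - K @ H) @ P
    return x, P
```

A single predict/update cycle on one landmark shrinks the joint covariance and correlates the camera and landmark estimates; it is this cross-correlation that lets repeated fixation on known features correct the camera pose as the wearer moves.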