We propose a novel, vision-based method for robot homing, the problem of computing a route by which a robot can return to its initial "home" position after executing an arbitrary "prior" path. The method assumes that the robot tracks visual features in panoramic views of the environment that it acquires as it moves. Exploiting only angular information about the tracked features, a local control strategy moves the robot between two positions, provided that at least three features can be matched in the panoramas acquired at these positions. The strategy succeeds when certain geometric constraints on the configuration of the two positions relative to the features are fulfilled. To achieve long-range homing, the features' trajectories are organized into a visual memory during the execution of the "prior" path. When homing is initiated, the robot selects Milestone Positions (MPs) on the "prior" path by exploiting information in its visual memory. The MP selection process aims to pick positions that guarantee the success of the local control strategy between two consecutive MPs. Visiting the MPs in sequence guides the robot home even if the visual context at the "home" position is radically different from that at the position where homing was initiated. Experimental results from a prototype implementation demonstrate that homing can be achieved with high accuracy, independent of the distance traveled by the robot. The contribution of this work is to show how a complex navigational task such as homing can be accomplished efficiently, robustly, and in real time by exploiting primitive visual cues. Such cues carry implicit information about the 3D structure of the environment; thus, neither the computation of explicit range information nor a geometric map is required.
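The abstract's two algorithmic ingredients, a bearing-only local control step and milestone selection from the visual memory, can be illustrated with a minimal sketch. The control law below is the classic Average Landmark Vector (ALV) scheme, used here only as a stand-in for the paper's (unspecified) angular strategy; `select_milestones` is a hypothetical greedy pass over feature-visibility intervals. All function names, parameters, and the common compass-aligned frame are assumptions for illustration, not the authors' implementation.

```python
import math

def average_landmark_vector(bearings):
    # Mean of the unit vectors pointing toward each tracked feature.
    n = len(bearings)
    return (sum(math.cos(b) for b in bearings) / n,
            sum(math.sin(b) for b in bearings) / n)

def home_direction(current_bearings, home_bearings):
    """Bearing-only home vector (ALV scheme, an illustrative stand-in
    for the paper's local control strategy).

    Both bearing lists refer to the same matched features, measured in
    a common compass-aligned frame; at least three matches are assumed,
    as the abstract requires.
    """
    if len(current_bearings) < 3 or len(current_bearings) != len(home_bearings):
        raise ValueError("need at least three matched features")
    cx, cy = average_landmark_vector(current_bearings)
    hx, hy = average_landmark_vector(home_bearings)
    dx, dy = cx - hx, cy - hy
    norm = math.hypot(dx, dy) or 1.0
    return (dx / norm, dy / norm)      # unit vector toward "home"

def select_milestones(feature_intervals, path_length, min_shared=3):
    """Hypothetical greedy Milestone Position (MP) selection.

    feature_intervals: one (first_idx, last_idx) pair per feature,
    giving the stretch of "prior"-path positions over which that
    feature was tracked (the visual memory).  Starting from the end of
    the path, jump each time to the earliest position that still shares
    at least `min_shared` features with the current MP, so the local
    control strategy can bridge consecutive MPs.
    """
    def shared(i, j):
        lo, hi = min(i, j), max(i, j)
        return sum(1 for a, b in feature_intervals if a <= lo and b >= hi)

    mps = [path_length - 1]
    while mps[-1] > 0:
        cur = mps[-1]
        nxt = cur - 1                  # fall back to the adjacent step
        for cand in range(cur):
            if shared(cand, cur) >= min_shared:
                nxt = cand
                break
        mps.append(nxt)
    return list(reversed(mps))
```

With distant landmarks and a small displacement, `home_direction` points back toward the start position, and `select_milestones` returns the path indices the robot would visit in sequence during homing.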