Linked Local Visual Navigation and Robustness to Motor Noise and Route Displacement
SAB '08 Proceedings of the 10th international conference on Simulation of Adaptive Behavior: From Animals to Animats
Insects are able to navigate reliably between food and nest using only visual information. This behavior has inspired many models of visual landmark guidance, some of which have been tested on autonomous robots. The majority of these models work by comparing the agent's current view with a view of the world stored when the agent was at the goal. The region from which an agent can successfully reach home is therefore limited to the goal's visual locale, that is, the area around the goal within which the visual scene does not differ radically from the view at the goal. Ants, however, are known to navigate over large distances using visually guided routes consisting of a series of visual memories. Taking inspiration from such route navigation, we propose a framework for linking together local navigation methods. We implement this framework on a robotic platform and test it in a series of environments in which local navigation methods alone fail. Finally, we show that the framework is robust to environments of varying complexity.
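The linking idea described above can be illustrated with a minimal sketch: the agent runs a local visual-homing routine toward each stored view in turn, and switches to the next waypoint memory once its current view is sufficiently similar to the stored one. This is not the paper's implementation; the function names, the RMS image-difference measure, and the arrival threshold are illustrative assumptions.

```python
import numpy as np

def view_difference(current, stored):
    """Root-mean-square pixel difference between two views (illustrative metric)."""
    return np.sqrt(np.mean((np.asarray(current, float) - np.asarray(stored, float)) ** 2))

def follow_route(get_view, home_step, route_memories, arrival_threshold=5.0):
    """Traverse a route by chaining local visual-homing segments.

    get_view()              -- returns the agent's current view (hypothetical sensor hook)
    home_step(view, memory) -- performs one local-homing motion toward `memory`
    route_memories          -- ordered list of views stored along the route
    """
    for memory in route_memories:
        # Run local homing until the current view matches this waypoint memory.
        while view_difference(get_view(), memory) > arrival_threshold:
            home_step(get_view(), memory)
        # Waypoint reached: hand over to the next stored view.
```

In a toy one-dimensional world where the "view" is just the agent's position, chaining two waypoint memories carries the agent well beyond the catchment area of either one alone, which is the point of the framework.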