Linked Local Navigation for Visual Route Guidance

  • Authors:
  • Lincoln Smith; Andrew Philippides; Paul Graham; Bart Baddeley; Philip Husbands

  • Affiliations:
  • Centre for Computational Neuroscience and Robotics, Department of Informatics, University of Sussex, UK (all authors)

  • Venue:
  • Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
  • Year:
  • 2007


Abstract

Insects are able to navigate reliably between food and nest using only visual information. This behavior has inspired many models of visual landmark guidance, some of which have been tested on autonomous robots. The majority of these models work by comparing the agent's current view with a view of the world stored when the agent was at the goal. The region from which agents can successfully reach home is therefore limited to the goal's visual locale, that is, the area around the goal where the visual scene does not differ radically from the view at the goal position. Ants are known to navigate over large distances using visually guided routes consisting of a series of visual memories. Taking inspiration from such route navigation, we propose a framework for linking together local navigation methods. We implement this framework on a robotic platform and test it in a series of environments in which local navigation methods fail. Finally, we show that the framework is robust to environments of varying complexity.
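
The sketch below illustrates the general idea of chaining local navigation segments along a stored route, not the authors' specific implementation. It assumes a snapshot-style local homing method and hypothetical helpers (`capture_view`, `home_step`) standing in for the robot's sensing and motor interfaces; the view-matching threshold is likewise an illustrative assumption.

```python
import numpy as np

def view_difference(current, stored):
    """Root-mean-square pixel difference between two panoramic views."""
    return np.sqrt(np.mean((current.astype(float) - stored.astype(float)) ** 2))

def follow_route(stored_views, capture_view, home_step,
                 match_threshold=10.0, max_steps=1000):
    """Traverse a route by linking local (snapshot-style) homing segments.

    stored_views : list of panoramic images memorised along the route
    capture_view : hypothetical callable returning the current panoramic view
    home_step    : hypothetical callable(current, target) moving the agent one
                   step towards the viewpoint of `target` (any local method)
    """
    for target in stored_views:
        for _ in range(max_steps):
            current = capture_view()
            # Waypoint reached: the current view matches the stored memory,
            # so hand control to the next local navigation segment.
            if view_difference(current, target) < match_threshold:
                break
            home_step(current, target)
```

Each stored view only needs to be reachable from within the visual locale of the previous one, which is what lets a chain of local methods cover distances far beyond any single snapshot's catchment area.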