This article presents a minimalistic model for the learning and adaptation of visual homing. Normalized Hebbian learning is applied during exploration tours of a mobile robot, both to learn visual homing and to adapt to the robot's sensory modalities. The robot's sensors (omnidirectional camera, magnetic compass) were chosen so that their data closely resemble the sensory information available to insects such as the desert ant Cataglyphis (almost omnidirectional vision, a polarized-light compass), which is a remarkable navigator despite its tiny brain. The learned homing mechanism turns out to be closely related to the average landmark vector (ALV) model of Lambrinos and colleagues, and is largely independent of special features of the environment. In contrast to the ALV model and other models of visual homing, no feature extraction or landmark segmentation is required. Mobile robot experiments in an unmodified office environment confirm the feasibility of learning visual homing.
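The ALV model mentioned above can be illustrated with a short sketch (helper names are hypothetical; this does not reproduce the paper's learned weights or its segmentation-free variant): the ALV at a location is the mean of the unit bearing vectors to the visible landmarks, and the homing direction is approximated by the difference between the ALV at the current location and the ALV stored at the home location, assuming both are expressed in a common compass frame.

```python
import math

def alv(landmarks, viewpoint):
    """Average landmark vector: mean of the unit vectors pointing
    from the viewpoint to each landmark (bearings only, no distances)."""
    vx = vy = 0.0
    for lx, ly in landmarks:
        dx, dy = lx - viewpoint[0], ly - viewpoint[1]
        d = math.hypot(dx, dy)
        vx += dx / d
        vy += dy / d
    n = len(landmarks)
    return (vx / n, vy / n)

def home_vector(landmarks, current, home):
    """ALV homing: the difference ALV(current) - ALV(home)
    approximates the direction from the current location to home."""
    acx, acy = alv(landmarks, current)
    ahx, ahy = alv(landmarks, home)
    return (acx - ahx, acy - ahy)
```

For example, with four landmarks placed symmetrically around a home position at the origin, a robot displaced along the positive x-axis obtains a home vector pointing back in the negative x-direction.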