Many insects and other animals navigate using their own biological navigation systems, and biologically inspired methods have been proposed for landmark-based navigation of mobile robots. These methods determine the movement direction by comparing a snapshot image taken at the home position with another snapshot taken at the current position. In this paper, we propose a new landmark-matching method for robotic homing navigation that first computes the distance to each landmark from ego-motion and then estimates the landmark arrangement in the snapshot image. Landmark vectors are then used to localize the robotic agent in the environment and to choose the appropriate direction to return home. As a result, this method achieves a higher success rate for returning home from an arbitrary position than conventional image-matching algorithms.
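The homing step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each landmark has already been matched between the home and current snapshots and expressed as a 2D landmark vector (distance times bearing) in a shared, compass-aligned frame. Under that assumption, the difference between a landmark's current-pose vector and its home-pose vector equals the offset from the current position to home, so averaging over landmarks yields a noise-robust homing direction. All function names and the input format are hypothetical.

```python
import math

def homing_vector(current_lv, home_lv):
    """Estimate the vector from the current position back to home.

    current_lv, home_lv: lists of (x, y) landmark vectors, i.e. each
    matched landmark's position relative to the robot at the current
    pose and at home, in a shared compass-aligned frame (hypothetical
    input format; the paper derives the distances from ego-motion).

    For a matched landmark i with world position L_i,
        current_lv[i] = L_i - P_current,  home_lv[i] = L_i - P_home,
    so current_lv[i] - home_lv[i] = P_home - P_current, the desired
    movement vector. Averaging over landmarks reduces noise.
    """
    n = len(current_lv)
    dx = sum(c[0] - h[0] for c, h in zip(current_lv, home_lv)) / n
    dy = sum(c[1] - h[1] for c, h in zip(current_lv, home_lv)) / n
    return dx, dy

def homing_heading(current_lv, home_lv):
    """Heading angle (radians) the agent should move along."""
    dx, dy = homing_vector(current_lv, home_lv)
    return math.atan2(dy, dx)
```

For example, with home at the origin and the robot at (3, 4), every matched landmark yields the same difference (-3, -4), so the averaged homing vector points straight back to home regardless of how many landmarks are used.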