This paper describes a biologically inspired approach to vision-only simultaneous localization and mapping (SLAM) on ground-based platforms. The core SLAM system, dubbed RatSLAM, is based on computational models of the rodent hippocampus, and is coupled with a lightweight vision system that provides odometry and appearance information. RatSLAM builds a map in an online manner, driving loop closure and relocalization through sequences of familiar visual scenes. Visual ambiguity is managed by maintaining multiple competing vehicle pose estimates, while cumulative errors in odometry are corrected after loop closure by a map correction algorithm. We demonstrate the mapping performance of the system on a 66 km car journey through a complex suburban road network. Using only a web camera operating at 10 Hz, RatSLAM generates a coherent map of the entire environment at real-time speed, correctly closing more than 51 loops of up to 5 km in length.
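The abstract's map correction step can be illustrated with a simplified pose-graph relaxation: after a loop closure links two visually matched poses, every pose is nudged toward agreement with its odometry constraints. This is only a minimal sketch in the spirit of the described algorithm, not the paper's actual implementation; the function name, learning rate, and iteration count are illustrative assumptions.

```python
def relax_map(poses, links, alpha=0.5, iterations=50):
    """Iteratively relax a 2-D pose graph.

    poses: list of [x, y] pose estimates (mutated in place).
    links: (i, j, dx, dy) constraints stating pose j should lie at
           pose i plus the odometry displacement (dx, dy); a loop
           closure is just another link between non-adjacent poses.
    alpha: illustrative relaxation rate (not from the paper).
    """
    for _ in range(iterations):
        corrections = [[0.0, 0.0] for _ in poses]
        for i, j, dx, dy in links:
            # Residual: how far pose j is from where the link predicts it.
            rx = poses[j][0] - (poses[i][0] + dx)
            ry = poses[j][1] - (poses[i][1] + dy)
            # Split the disagreement between both endpoints.
            corrections[i][0] += 0.5 * rx
            corrections[i][1] += 0.5 * ry
            corrections[j][0] -= 0.5 * rx
            corrections[j][1] -= 0.5 * ry
        for p, c in zip(poses, corrections):
            p[0] += alpha * c[0]
            p[1] += alpha * c[1]
    return poses
```

For example, a four-pose loop whose last odometry step has drifted by half a metre is pulled back into a configuration consistent with both the odometry links and the loop-closure link, distributing the accumulated error around the loop rather than dumping it at the closure point.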