This paper addresses some of the challenges of implementing Simultaneous Localisation and Mapping (SLAM) in real time on a UAV platform. Compared with SLAM in 2D ground environments, the airborne setting imposes additional difficulties: higher computational cost, harder loop closure, and strong nonlinearity in both the vehicle dynamics and the observations. We formulate an airborne SLAM implementation that reduces this computational complexity in both direct and indirect ways. The implementation is based on an Extended Kalman Filter (EKF), which fuses data from an Inertial Measurement Unit (IMU) with observations from a passive vision system. Real-time results from flight trials are presented.
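To illustrate the kind of fusion the abstract describes, the sketch below shows a minimal EKF that propagates a state with IMU accelerations (prediction) and corrects it with vision-derived position fixes (update). This is an illustrative toy on a 1D position/velocity state, not the paper's actual filter: the process model, noise values, and function names are all assumptions made here for clarity.

```python
import numpy as np

def ekf_predict(x, P, a_meas, dt, Q):
    """Propagate a [position, velocity] state using an IMU acceleration
    measurement as the control input (illustrative 1D model)."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])           # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])      # maps acceleration into the state
    x = F @ x + B * a_meas
    P = F @ P @ F.T + Q                  # grow uncertainty with process noise
    return x, P

def ekf_update(x, P, z, R):
    """Correct the state with a position observation, e.g. derived from
    a passive vision system (hypothetical measurement model)."""
    H = np.array([[1.0, 0.0]])           # we observe position only
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P          # shrink uncertainty after the fix
    return x, P

# One predict/update cycle: IMU data arrives at a high rate (many predicts),
# vision fixes at a lower rate (occasional updates).
x = np.array([0.0, 0.0])                 # initial position and velocity
P = np.eye(2) * 10.0                     # large initial uncertainty
Q = np.eye(2) * 0.01                     # process noise (assumed)
R = np.array([[0.1]])                    # vision measurement noise (assumed)

x, P = ekf_predict(x, P, a_meas=1.0, dt=0.1, Q=Q)
x, P = ekf_update(x, P, z=np.array([0.005]), R=R)
```

In a full airborne SLAM filter the state also carries attitude, IMU biases, and the map landmarks, which is precisely where the computational complexity discussed in the paper comes from: the covariance grows quadratically with the number of landmarks.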