This work presents a vision-based navigation system for a vertical takeoff and landing unmanned aerial vehicle (UAV). It is a monocular, vision-based simultaneous localization and mapping (SLAM) system that measures the position and orientation of the camera and builds a map of the environment from the video stream of a single camera. This differs from past SLAM solutions on UAVs, which use sensors that measure depth directly, such as LIDAR, stereoscopic cameras, or depth cameras. The solution presented in this paper extends and significantly modifies a recent open-source algorithm that solves the SLAM problem using an approach fundamentally different from the traditional one. The proposed modifications provide the position measurements necessary for a navigation solution on a UAV. The main contributions of this work are: (1) an extension of the map-building algorithm that allows it to be used realistically while controlling a UAV and simultaneously building the map; (2) improved performance of the SLAM algorithm at lower camera frame rates; and (3) the first known demonstration of a monocular SLAM algorithm successfully controlling a UAV while simultaneously building the map. This work demonstrates that a fully autonomous UAV that uses monocular vision for navigation is feasible.
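The abstract does not detail the map-building internals, but keyframe-based monocular SLAM systems of this kind grow the map by promoting a frame to a keyframe only when the camera has moved sufficiently far from all existing keyframes. The sketch below is a minimal, hypothetical illustration of that keyframe-selection heuristic; all names (`Keyframe`, `should_add_keyframe`, the distance threshold) are invented for illustration and stand in for feature-based pose estimation in a real system.

```python
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    frame_id: int
    pose: tuple  # (x, y, z) camera position; illustrative only

@dataclass
class Map:
    keyframes: list = field(default_factory=list)

def should_add_keyframe(slam_map, pose, min_dist=1.0):
    # Hypothetical heuristic: promote a frame to a keyframe only when
    # the camera is at least min_dist away from every existing keyframe.
    for kf in slam_map.keyframes:
        dist = sum((a - b) ** 2 for a, b in zip(pose, kf.pose)) ** 0.5
        if dist < min_dist:
            return False
    return True

def build_map(frame_poses):
    # "Tracking" is stubbed out: in a real system each pose would be
    # estimated from image features, not supplied directly.
    slam_map = Map()
    for i, pose in enumerate(frame_poses):
        if should_add_keyframe(slam_map, pose):
            slam_map.keyframes.append(Keyframe(i, pose))
    return slam_map

m = build_map([(0, 0, 0), (0.2, 0, 0), (1.5, 0, 0), (1.6, 0, 0), (3.2, 0, 0)])
print([kf.frame_id for kf in m.keyframes])  # → [0, 2, 4]
```

Spacing keyframes this way keeps the map compact, which matters at the lower frame rates and onboard compute budgets the paper targets.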