Journal of Intelligent and Robotic Systems
This paper presents a novel solution that enables micro aerial vehicles (MAVs) to autonomously search for and land on an arbitrary landing site using real-time monocular vision. Before the task begins, the autonomous MAV is provided with only a single reference image of the landing site, whose physical size is unknown. We extend a well-known monocular visual SLAM algorithm, which enables autonomous navigation of the MAV in unknown environments, to search for such landing sites. Furthermore, a multi-scale ORB-feature-based method is implemented and integrated into the SLAM framework for landing-site detection. We use a RANSAC-based method to locate the landing site within the map built by the SLAM system, taking advantage of the map points associated with the detected landing site. We demonstrate the effectiveness of the presented vision system in autonomous flights, both indoors and in challenging outdoor environments.
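The abstract describes fitting a geometric model to ORB feature correspondences with RANSAC in order to locate the landing site among the map points. As a loose illustration only (not the authors' implementation), the sketch below runs a minimal two-point RANSAC that fits a scale-plus-translation model to noisy 2-D point correspondences and rejects outlier matches; the function name, the simplified rotation-free model, and all parameters are assumptions for the example.

```python
import random

def ransac_scale_translation(src, dst, iters=500, tol=0.05):
    """Fit dst ~ s*src + (tx, ty) to 2-D correspondences, robust to
    outlier matches, using a minimal two-point RANSAC (no rotation)."""
    best_inliers, best_model = [], None
    for _ in range(iters):
        a, b = random.sample(range(len(src)), 2)
        # Scale hypothesis from the ratio of the two pairwise distances.
        d_src = ((src[a][0] - src[b][0]) ** 2 + (src[a][1] - src[b][1]) ** 2) ** 0.5
        d_dst = ((dst[a][0] - dst[b][0]) ** 2 + (dst[a][1] - dst[b][1]) ** 2) ** 0.5
        if d_src == 0:
            continue  # degenerate sample: coincident source points
        s = d_dst / d_src
        # Translation hypothesis from the first sampled correspondence.
        tx = dst[a][0] - s * src[a][0]
        ty = dst[a][1] - s * src[a][1]
        # Count correspondences consistent with this hypothesis.
        inliers = [i for i, (p, q) in enumerate(zip(src, dst))
                   if abs(s * p[0] + tx - q[0]) < tol
                   and abs(s * p[1] + ty - q[1]) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (s, tx, ty)
    return best_model, best_inliers
```

In a real pipeline the correspondences would come from matching multi-scale ORB descriptors between the reference image and the current frame, and the fitted model (typically a full homography rather than this simplified one) would then be used to transfer the landing-site region into the SLAM map.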