In this paper, an airborne vision-based navigation method for accurate Unmanned Aerial Vehicle (UAV) landing is presented. In this method, a visible-light camera with an integrated Digital Signal Processing (DSP) processor is mounted on the UAV, and a 940 nm optical filter is fixed in front of the camera lens. In addition, four infrared light-emitting diode (LED) lamps emitting at 940 nm are placed behind the ideal landing site on the runway, so that the lamps remain distinct in the image even when the background is cluttered. In the image-processing procedure, the maximum between-class variance (Otsu) algorithm and a region-growing algorithm are first used to determine candidate infrared-lamp regions in the images. A Negative Laplacian of Gaussian (NLOG) operator is then applied to detect and track the centers of the infrared lamps. The spatial position and attitude of the camera are obtained from the correspondence between the image coordinates and the known spatial coordinates of the lamp centers. Finally, a high-precision spatial position of the UAV is computed from the known mounting relationship between the camera and the airframe. Extensive real-flight and static precision experiments confirm the validity and accuracy of the proposed method.
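To make the first image-processing step concrete, the following is a minimal sketch of the maximum between-class variance (Otsu) thresholding used to isolate candidate lamp regions. This is a generic textbook formulation in NumPy, not the authors' on-board DSP implementation; the synthetic image below (a dark runway background with bright lamp blobs) is an illustrative assumption.

```python
import numpy as np

def otsu_threshold(image):
    """Return the gray level maximizing the between-class variance (Otsu)."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()                 # normalized histogram
    omega = np.cumsum(prob)                  # class-0 probability up to level k
    mu = np.cumsum(prob * np.arange(256))    # cumulative first moment
    mu_t = mu[-1]                            # global mean gray level
    # Between-class variance for every candidate threshold k.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)         # undefined where one class is empty
    return int(np.argmax(sigma_b))

# Illustrative synthetic frame: dim background, two bright 4x4 "lamp" blobs.
frame = np.full((64, 64), 30, dtype=np.uint8)
frame[10:14, 10:14] = 220
frame[40:44, 50:54] = 220
t = otsu_threshold(frame)
candidates = frame > t   # binary mask of candidate lamp pixels
```

In the paper's pipeline, a mask like `candidates` would then be refined by region growing before the NLOG operator localizes each lamp center.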