Vision Based Navigation Algorithm for Autonomic Landing of UAV without Heading & Attitude Sensors

  • Authors:
  • Tang Daquan; Zhang Hongyue

  • Affiliations:
  • -;-

  • Venue:
  • SITIS '07 Proceedings of the 2007 Third International IEEE Conference on Signal-Image Technologies and Internet-Based System
  • Year:
  • 2007

Abstract

A navigation algorithm based entirely on machine vision is presented for the autonomic landing of a UAV without heading and attitude sensors. The image of the airport runway lighting acquired by the airborne camera is determined by the aircraft's attitude, heading, and position relative to the runway. The image gradients of the centerline and threshold bar of the runway lighting, together with the longitudinal and lateral means of the image coordinates of the observed airport lights, can be calculated and used as measurements in an extended Kalman filter. The filter then generates estimates of the aircraft's motion parameters, including position and velocity relative to the ground as well as attitude, heading, and rotation rate. Simulation results indicate that the navigation algorithm meets the navigation accuracy requirements of the various FAA landing categories.
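
To make the measurement construction above concrete, the following sketch (in Python with NumPy) projects known runway-light positions through a pinhole camera, forms the image-gradient and coordinate-mean measurements mentioned in the abstract, and shows a generic extended-Kalman-filter measurement update that would consume them. Every function name, the camera model, the NED/camera frame conventions, and the example light layout are assumptions for illustration only; none of them are taken from the paper.

```python
import numpy as np

def project_lights(lights_ned, pos_ned, R_cam_from_ned, focal_px, principal_pt):
    """Project 3-D runway-light positions (NED frame) into pixel coordinates
    with a pinhole model (assumed camera axes: x right, y down, z forward)."""
    rel = (lights_ned - pos_ned) @ R_cam_from_ned.T   # light vectors in the camera frame
    rel = rel[rel[:, 2] > 1e-3]                       # keep only lights in front of the camera
    u = principal_pt[0] + focal_px * rel[:, 0] / rel[:, 2]
    v = principal_pt[1] + focal_px * rel[:, 1] / rel[:, 2]
    return np.column_stack([u, v])

def line_gradient(pix):
    """Least-squares image gradient (slope dv/du) of one row of lights."""
    A = np.column_stack([pix[:, 0], np.ones(len(pix))])
    slope, _intercept = np.linalg.lstsq(A, pix[:, 1], rcond=None)[0]
    return slope

def measurement_vector(centerline_pix, threshold_pix):
    """Stack the measurements named in the abstract; which image axis counts as
    'longitudinal' vs. 'lateral' is an assumption here."""
    all_pix = np.vstack([centerline_pix, threshold_pix])
    return np.array([
        line_gradient(centerline_pix),   # image gradient of the centerline lights
        line_gradient(threshold_pix),    # image gradient of the threshold bar
        all_pix[:, 0].mean(),            # longitudinal mean of the light image coordinates
        all_pix[:, 1].mean(),            # lateral mean of the light image coordinates
    ])

def ekf_update(x, P, z, h, H, R_meas):
    """Generic EKF measurement update; h(x) would recompute the quantities above
    from the predicted state by projecting the known light positions."""
    y = z - h(x)                          # innovation
    S = H @ P @ H.T + R_meas              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Illustrative layout: centerline lights along the runway axis, threshold bar across it.
centerline = np.array([[float(x), 0.0, 0.0] for x in range(0, 300, 30)])
threshold = np.array([[0.0, float(y), 0.0] for y in range(-20, 21, 5)])
pos = np.array([-400.0, 5.0, -60.0])           # aircraft NED position: 400 m short, 60 m up
R_cam_from_ned = np.array([[0.0, 1.0, 0.0],    # camera x = east
                           [0.0, 0.0, 1.0],    # camera y = down
                           [1.0, 0.0, 0.0]])   # camera z = north (toward the runway)
z = measurement_vector(
    project_lights(centerline, pos, R_cam_from_ned, 800.0, (320.0, 240.0)),
    project_lights(threshold, pos, R_cam_from_ned, 800.0, (320.0, 240.0)),
)
print(z)
```

In a full estimator, the prediction step would propagate position, velocity, attitude, heading, and rotation rate through the aircraft dynamics, and H would be the Jacobian of the projection-plus-measurement map with respect to that state.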