Omnidirectional vision applied to Unmanned Aerial Vehicles (UAVs) attitude and heading estimation

  • Authors:
  • Iván F. Mondragón; Pascual Campoy; Carol Martinez; Miguel Olivares

  • Affiliations:
  • Computer Vision Group, Universidad Politécnica de Madrid, C. José Gutiérrez Abascal 2, 28006 Madrid, Spain (all authors)

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2010

Abstract

This paper presents an aircraft attitude and heading estimator that uses catadioptric images either as the principal sensor for a UAV or as a redundant system for the IMU (Inertial Measurement Unit) and gyroscope sensors. First, we explain how the unified theory for central catadioptric cameras is used for attitude and heading estimation, describing how the skyline is projected onto the catadioptric image and how it is segmented and used to compute the UAV's attitude. Then, we use appearance images to obtain a visual compass and calculate the relative rotation and heading of the aerial vehicle. Finally, tests and results using the UAV COLIBRI platform and their validation in real flights are presented, comparing the estimated data with the inertial values measured on board.
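
As a rough illustration of the appearance-based visual compass mentioned in the abstract, the Python sketch below estimates the relative yaw between two unwrapped panoramic images by finding the column shift that minimizes the sum of squared differences. The function name, the brute-force search, and the synthetic data are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def visual_compass_yaw(prev_pano, curr_pano):
        """Estimate relative yaw (degrees) between two unwrapped panoramic images.

        Both inputs are grayscale float arrays of identical shape whose columns
        span 360 degrees of azimuth; a pure rotation of the camera about its
        vertical axis appears as a horizontal (column) shift.
        """
        n_cols = prev_pano.shape[1]
        best_shift, best_cost = 0, np.inf
        for shift in range(n_cols):
            # Cost of aligning the current panorama to the previous one at this shift.
            cost = np.sum((np.roll(curr_pano, shift, axis=1) - prev_pano) ** 2)
            if cost < best_cost:
                best_cost, best_shift = cost, shift
        yaw = best_shift * 360.0 / n_cols
        return yaw - 360.0 if yaw > 180.0 else yaw  # map to (-180, 180]

    # Synthetic check: a 360-column panorama rotated by 10 columns (~10 degrees).
    pano = np.random.rand(64, 360)
    print(visual_compass_yaw(pano, np.roll(pano, -10, axis=1)))  # prints ~10.0

In practice the search would be limited to a small window around the previous heading and the panoramas would come from unwrapping the catadioptric image, but the column-shift principle is the same.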