Parallel tracking and mapping for controlling VTOL airframe

  • Authors:
  • Michal Jama; Dale Schinstock

  • Affiliations:
  • Department of Electrical and Computer Engineering, Kansas State University, Manhattan, KS; Department of Mechanical and Nuclear Engineering, Kansas State University, Manhattan, KS

  • Venue:
  • Journal of Control Science and Engineering
  • Year:
  • 2011

Abstract

This work presents a vision-based system for navigation on a vertical takeoff and landing (VTOL) unmanned aerial vehicle (UAV). It is a monocular, vision-based, simultaneous localization and mapping (SLAM) system that measures the position and orientation of the camera and builds a map of the environment from the video stream of a single camera. This differs from past SLAM solutions on UAVs, which use sensors that measure depth, such as LIDAR, stereoscopic cameras, or depth cameras. The solution presented in this paper extends and significantly modifies a recent open-source algorithm that solves the SLAM problem with an approach fundamentally different from the traditional one. The proposed modifications provide the position measurements necessary for the navigation solution on a UAV. The main contributions of this work are: (1) an extension of the map-building algorithm that allows it to be used realistically while controlling a UAV and simultaneously building the map; (2) improved performance of the SLAM algorithm at lower camera frame rates; and (3) the first known demonstration of a monocular SLAM algorithm successfully controlling a UAV while simultaneously building the map. This work demonstrates that a fully autonomous UAV that uses monocular vision for navigation is feasible.
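To illustrate the general idea described in the abstract, the minimal sketch below shows how an up-to-scale pose estimate from a monocular SLAM front end might be scaled to metric units and used as feedback in a simple position controller. This is not the authors' implementation or the PTAM API; the names SlamPose, estimate_scale, and PositionController, the PD control law, and the scale-from-known-displacement step are illustrative assumptions.

# Hypothetical sketch: using an (up-to-scale) monocular SLAM pose estimate
# as position feedback for a VTOL UAV. All class and function names are
# illustrative; they do not come from the paper or from PTAM.
from dataclasses import dataclass
import numpy as np

@dataclass
class SlamPose:
    """Camera pose reported by the monocular SLAM front end (arbitrary map scale)."""
    position: np.ndarray   # 3-vector, in map units
    rotation: np.ndarray   # 3x3 rotation matrix, camera-to-map

def estimate_scale(slam_displacement: float, metric_displacement: float) -> float:
    """Ratio between a known metric displacement (e.g. a climb of known height)
    and the displacement reported by SLAM; monocular SLAM alone cannot recover
    absolute scale, so some external reference is needed."""
    return metric_displacement / slam_displacement

class PositionController:
    """Simple proportional-derivative position loop acting on SLAM feedback."""
    def __init__(self, kp: float = 1.0, kd: float = 0.4):
        self.kp, self.kd = kp, kd
        self._prev_err = np.zeros(3)

    def update(self, pose: SlamPose, setpoint: np.ndarray,
               scale: float, dt: float) -> np.ndarray:
        # Convert the SLAM position to metric units before computing the error.
        pos_metric = scale * pose.position
        err = setpoint - pos_metric
        derr = (err - self._prev_err) / dt
        self._prev_err = err
        # The returned correction vector would be mapped to attitude/thrust
        # commands by the vehicle's inner control loops.
        return self.kp * err + self.kd * derr

For example, commanding a hover at a 1 m setpoint would repeatedly call update() with each new SLAM pose, with scale fixed after an initial calibration maneuver; the point of the sketch is only that the camera pose, once scaled, can replace a conventional position sensor in the navigation loop.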