Vision-Based Road-Following Using Proportional Navigation

  • Authors:
  • Ryan S. Holt; Randal W. Beard

  • Affiliations:
  • Lincoln Laboratory, Massachusetts Institute of Technology, Lexington, MA 02420, USA; Electrical and Computer Engineering Department, Brigham Young University, Provo, UT 84602, USA

  • Venue:
  • Journal of Intelligent and Robotic Systems
  • Year:
  • 2010

Abstract

This paper describes a new approach to autonomous road following for an unmanned air vehicle (UAV) equipped with a visual sensor. A road is defined as any continuous, extended, curvilinear feature, which can include city streets, highways, and dirt roads, as well as forest-fire perimeters, shorelines, and fenced borders. To achieve autonomous road following, the paper uses Proportional Navigation as the basis for the guidance law, with visual information fed directly back into the controller. The tracking target for the Proportional Navigation algorithm is the position on the edge of the camera frame at which the road flows into the image; consequently, only the edge of each frame in the video stream needs to be searched, which significantly reduces the computational requirements of the computer vision algorithms. Analysis of the tracking error, defined in the camera reference frame, shows that the Proportional Navigation guidance law exhibits a steady-state error caused by bends and turns in the road, which are perceived as road motion. The guidance algorithm is therefore adjusted using Augmented Proportional Navigation Guidance, which accounts for the perceived road accelerations and forces the steady-state error to zero. The effectiveness of the solution is demonstrated through high-fidelity simulations and through flight tests of a small autonomous UAV.
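As a rough illustration of the guidance law the abstract describes, the Python sketch below converts the road's entry point on the image edge into a bearing angle, finite-differences successive bearings to estimate the line-of-sight rate, and forms both the Proportional Navigation and Augmented Proportional Navigation acceleration commands. All function names, the pinhole bearing model, the gain N = 4, and the numbers in the example are illustrative assumptions, not the authors' implementation.

```python
import math

# Navigation constant; values of 3-5 are typical in the PN literature (assumed here).
N = 4.0

def road_entry_bearing(edge_column, image_width, horizontal_fov):
    # Normalized horizontal offset of the road's entry point from the image
    # center, in [-0.5, 0.5], scaled by the field of view to give a bearing
    # in radians (simple pinhole-camera assumption).
    offset = edge_column / image_width - 0.5
    return offset * horizontal_fov

def los_rate(bearing_now, bearing_prev, dt):
    # Crude finite-difference estimate of the line-of-sight rate between
    # consecutive frames; a real system would filter this signal.
    return (bearing_now - bearing_prev) / dt

def pn_accel(lam_dot, closing_speed):
    # Classic Proportional Navigation: lateral acceleration command
    # proportional to the line-of-sight rate toward the tracking target.
    return N * closing_speed * lam_dot

def apng_accel(lam_dot, closing_speed, road_accel_est):
    # Augmented PN: a feedforward term for the perceived road acceleration
    # (bends and turns seen as target motion) drives the steady-state
    # tracking error to zero.
    return pn_accel(lam_dot, closing_speed) + 0.5 * N * road_accel_est

# Example: the road enters the top edge at column 364 of a 640-px frame
# (60-degree horizontal FOV), having been at column 360 one frame (1/30 s) ago.
fov = math.radians(60.0)
lam_prev = road_entry_bearing(360, 640, fov)
lam_now = road_entry_bearing(364, 640, fov)
lam_dot = los_rate(lam_now, lam_prev, 1.0 / 30.0)
a_cmd = apng_accel(lam_dot, closing_speed=13.0, road_accel_est=0.2)
print(f"commanded lateral acceleration: {a_cmd:.2f} m/s^2")
```

The 0.5·N weighting on the feedforward term follows the standard augmented PN formulation; the paper's exact gain structure and its mapping from the acceleration command to UAV roll or heading commands may differ.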