Vision-Aided Outdoor Navigation of an Autonomous Horticultural Vehicle

  • Authors:
  • B. Southall; T. Hague; J. A. Marchant; Bernard F. Buxton

  • Venue:
  • ICVS '99 Proceedings of the First International Conference on Computer Vision Systems
  • Year:
  • 1999

Abstract

An autonomous outdoor vehicle has been developed at the Silsoe Research Institute as a testbed to investigate precise crop protection. The vehicle is able to navigate along rows of crop by using a Kalman filter to fuse information from proprioceptive sensing (odometry and inertial sensors) with data from an on-board computer vision system to generate estimates of its position and orientation. This paper describes a novel implementation of a previously proposed vision algorithm which uses a model of the crop planting pattern to extract vehicle position and orientation information from observations of many plants in each image. It is demonstrated that by implementing the vision system to compress the multiple plant observations into a single "pseudo observation" of vehicle position and orientation, it is possible to separate the vision system from the main body of the vehicle navigation Kalman filter, thus simplifying the task of fusing data from different sources. The algorithm is also used to segment the image sequences into areas of crop and weed, thus providing potential for targeting treatment. The implementation is tested on the vehicle, and results are shown from trials both in an indoor test area and outdoors on a field of real crop. Segmentation results are given for images captured from the vehicle.
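The key architectural point of the abstract is that the vision system compresses many per-plant observations into a single "pseudo observation" of vehicle position and orientation, so the navigation Kalman filter can treat vision like any other sensor and fuse it with odometry and inertial data in one standard measurement update. The sketch below illustrates that idea only; the state layout, matrices, noise values, and function names are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (assumptions, not the paper's code) of fusing a vision
# "pseudo observation" of vehicle pose into a navigation Kalman filter.
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: state x, covariance P,
    measurement z with observation model H and noise covariance R."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Assumed state: [x, y, heading, speed, yaw_rate]; proprioceptive sensors
# drive the prediction step (omitted here), vision corrects the pose terms.
x = np.array([0.0, 0.0, 0.0, 0.5, 0.0])
P = np.diag([0.1, 0.1, 0.05, 0.01, 0.01])

# Pseudo observation: the vision algorithm has already reduced the many
# plant observations in an image to one (x, y, heading) estimate with its
# own covariance, so the navigation filter needs only this single update.
z_pose = np.array([0.02, -0.01, 0.005])        # hypothetical vision output
R_pose = np.diag([0.05, 0.05, 0.02])           # hypothetical vision covariance
H_pose = np.hstack([np.eye(3), np.zeros((3, 2))])

x, P = kalman_update(x, P, z_pose, H_pose, R_pose)
print("updated pose estimate:", x[:3])
```

Keeping the interface between the two subsystems down to a pose estimate plus covariance is what lets the vision system be developed and tested separately from the main navigation filter, as the abstract describes.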