Qualitative vision-based path following

  • Authors: Zhichao Chen, Stanley T. Birchfield

  • Affiliation: Department of Electrical and Computer Engineering, Clemson University, Clemson, SC

  • Venue: IEEE Transactions on Robotics - Special issue on rehabilitation robotics
  • Year: 2009


Abstract

We present a simple approach to vision-based path following for a mobile robot. Using a novel concept called the funnel lane, the coordinates of feature points during the replay phase are compared with those obtained during the teaching phase in order to determine the turning direction. Increased robustness is achieved by coupling the feature coordinates with odometry information. The system requires a single off-the-shelf, forward-looking camera with no calibration (either external or internal, including lens distortion). Implicit calibration of the system is needed only in the form of a single controller gain. The algorithm is qualitative in nature, requiring no map of the environment, no image Jacobian, no homography, no fundamental matrix, and no assumption about a flat ground plane. Experimental results demonstrate the capability of real-time autonomous navigation in both indoor and outdoor environments and on flat, slanted, and rough terrain with dynamic occluding objects for distances of hundreds of meters. We also demonstrate that the same approach works with wide-angle and omnidirectional cameras with only slight modification.
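To make the teach-and-replay comparison concrete, the sketch below shows one way a qualitative, per-feature turning rule of the kind described in the abstract could be written. The `TrackedFeature` fields, sign convention, combination of per-feature contributions, and the default `gain` are illustrative assumptions, not the paper's exact funnel-lane constraints or controller.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackedFeature:
    u_teach: float   # image x-coordinate stored during the teaching phase (pixels, image center at 0)
    u_replay: float  # image x-coordinate observed during the replay phase

def turn_command(features: List[TrackedFeature], gain: float = 0.01) -> float:
    """Combine per-feature qualitative comparisons into a single turning command.
    Positive output = turn right, negative = turn left (assumed sign convention)."""
    if not features:
        return 0.0
    command = 0.0
    for f in features:
        if f.u_replay > f.u_teach and f.u_replay > 0:
            # Feature sits to the right of both the image center and its taught
            # position: consistent with the robot having drifted left, so steer right.
            command += gain * min(f.u_replay, f.u_replay - f.u_teach)
        elif f.u_replay < f.u_teach and f.u_replay < 0:
            # Mirror case: steer left.
            command += gain * max(f.u_replay, f.u_replay - f.u_teach)
        # Otherwise the feature is consistent with the robot lying inside the
        # funnel lane, so it contributes no turning.
    return command / len(features)
```

In this toy version the only tuned quantity is the single gain, echoing the abstract's point that no camera calibration or geometric reconstruction (Jacobian, homography, fundamental matrix) is needed; in practice the gain would be scaled to the robot's steering interface.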