Outdoor Visual Position Estimation for Planetary Rovers

  • Authors:
  • Fabio Cozman; Eric Krotkov; Carlos Guestrin

  • Affiliations:
  • Fabio Cozman: Laboratory of Automation and Systems, University of São Paulo, São Paulo, Brazil (fgcozman@usp.br)
  • Eric Krotkov: Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213, USA (krotkov@cytometrics.com)
  • Carlos Guestrin: Computer Science Department, Gates Computer Science Building, Stanford University, Stanford, CA 94305, USA (guestrin@stanford.edu)

  • Venue:
  • Autonomous Robots
  • Year:
  • 2000

Abstract

This paper describes (1) a novel, effective algorithm for outdoor visual position estimation; (2) the implementation of this algorithm in the Viper system; and (3) the extensive tests that have demonstrated the superior accuracy and speed of the algorithm. The Viper system (Visual Position Estimator for Rovers) is geared towards robotic space missions, and its central purpose is to increase the situational awareness of a rover operator by presenting accurate position estimates. The system has been extensively tested with terrestrial and lunar imagery, in terrains ranging from moderate (the rounded hills of Pittsburgh and the high deserts of Chile) to rugged (the dramatic relief of the Apollo 17 landing site) to extreme (the jagged peaks of the Rockies). Results have consistently demonstrated that the visual estimation algorithm estimates position with an accuracy and reliability that greatly surpass previous work.
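The abstract does not spell out how Viper matches imagery to terrain, so the sketch below is only a generic illustration of the idea behind skyline-based position estimation, not the paper's actual algorithm. It assumes a small digital elevation model, renders the horizon profile (maximum terrain elevation angle per azimuth bin) at every candidate cell, and picks the cell whose profile best matches an observed one; all function names, parameters, and the brute-force search are illustrative choices.

```python
import numpy as np

def skyline_profile(dem, x, y, cell=1.0, n_az=36, obs_h=1.5):
    """Maximum terrain elevation angle in each azimuth bin, as seen
    from grid cell (x, y) at height dem[y, x] + obs_h."""
    h, w = dem.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dx = (xs - x) * cell
    dy = (ys - y) * cell
    dist = np.hypot(dx, dy)
    mask = dist > 0                    # exclude the observer's own cell
    rise = dem - (dem[y, x] + obs_h)
    elev = np.arctan2(rise, np.where(mask, dist, 1.0))
    # Bin every cell by azimuth (n_az must divide 360 evenly).
    az = (np.degrees(np.arctan2(dy, dx)) % 360).astype(int) // (360 // n_az)
    prof = np.full(n_az, -np.pi / 2)   # empty bins: horizon straight down
    for b in range(n_az):
        sel = elev[(az == b) & mask]
        if sel.size:
            prof[b] = sel.max()
    return prof

def estimate_position(dem, observed, **kw):
    """Exhaustive grid search: return the cell whose rendered skyline
    best matches the observed profile (sum of squared differences)."""
    best, best_err = None, np.inf
    h, w = dem.shape
    for y in range(h):
        for x in range(w):
            err = np.sum((skyline_profile(dem, x, y, **kw) - observed) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Demo on a synthetic terrain patch: render the skyline at a known cell,
# then recover that cell from the profile alone.
rng = np.random.default_rng(0)
dem = rng.uniform(0.0, 10.0, (20, 20))
observed = skyline_profile(dem, 12, 7)
print(estimate_position(dem, observed))
```

A real system would match profiles extracted from camera imagery against a map over a much larger search area, and would report an uncertainty region rather than a single best cell; the exhaustive search here is only to keep the example short.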