Vision-aided inertial navigation for pin-point landing using observations of mapped landmarks

  • Authors:
  • Nikolas Trawny; Anastasios I. Mourikis; Stergios I. Roumeliotis; Andrew E. Johnson; James F. Montgomery

  • Affiliations:
  • Dept. of Computer Science & Engineering, University of Minnesota, Minneapolis, MN 55455 (Trawny, Mourikis, Roumeliotis); Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91109 (Johnson, Montgomery)

  • Venue:
  • Journal of Field Robotics - Special Issue on Space Robotics, Part III
  • Year:
  • 2007

Abstract

In this paper we describe an extended Kalman filter algorithm for estimating the pose and velocity of a spacecraft during entry, descent, and landing. The proposed estimator combines measurements of rotational velocity and acceleration from an inertial measurement unit (IMU) with observations of a priori mapped landmarks, such as craters or other visual features, that exist on the surface of a planet. The tight coupling of inertial sensory information with visual cues results in accurate, robust state estimates available at a high bandwidth. The dimensions of the landing uncertainty ellipses achieved by the proposed algorithm are three orders of magnitude smaller than those possible when relying exclusively on IMU integration. Extensive experimental and simulation results are presented, which demonstrate the applicability of the algorithm on real-world data and analyze the dependence of its accuracy on several system design parameters. © 2007 Wiley Periodicals, Inc.
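To illustrate the propagate/update structure the abstract describes, the sketch below implements a deliberately simplified extended Kalman filter cycle: IMU acceleration drives the state propagation, and an observation of an a priori mapped landmark corrects the drift. This is not the authors' estimator; the 1-D position/velocity state, the linear measurement model, and all noise parameters (`q`, `r`) are illustrative assumptions.

```python
# Toy EKF cycle (illustrative, not the paper's implementation):
# propagate a 1-D [position, velocity] state with IMU acceleration,
# then correct it with a noisy observation of a mapped landmark.
import numpy as np

def ekf_step(x, P, accel, z, landmark, dt, q=0.1, r=0.5):
    """One propagate + update cycle.

    x: state [position, velocity]; P: 2x2 covariance
    accel: IMU-measured acceleration
    z: measured offset from the vehicle to the mapped landmark
    landmark: known (a priori mapped) landmark position
    """
    # --- Propagation (IMU integration) ---
    F = np.array([[1.0, dt], [0.0, 1.0]])           # constant-velocity transition
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    P = F @ P @ F.T + q * np.eye(2)                 # uncertainty grows with IMU noise

    # --- Update (mapped-landmark observation) ---
    # Assumed measurement model: z = landmark - position + noise
    H = np.array([[-1.0, 0.0]])
    y = z - (landmark - x[0])                       # innovation
    S = H @ P @ H.T + r                             # innovation covariance
    K = (P @ H.T) / S                               # Kalman gain, shape (2, 1)
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P                     # uncertainty shrinks after update
    return x, P
```

Running repeated `ekf_step` calls with landmark observations keeps the position covariance bounded, whereas propagation alone (skipping the update) lets it grow without bound, mirroring the orders-of-magnitude gap in landing uncertainty reported in the abstract.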