Sliding window filter with application to planetary landing

  • Authors:
  • Gabe Sibley; Larry Matthies; Gaurav Sukhatme

  • Affiliations:
  • Department of Computer Science, University of Southern California, Los Angeles, California 90089; Computer Vision Group, NASA Jet Propulsion Laboratory, Pasadena, California 91109; Department of Computer Science, University of Southern California, Los Angeles, California 90089

  • Venue:
  • Journal of Field Robotics - Visual Mapping and Navigation Outdoors
  • Year:
  • 2010

Abstract

We are concerned with improving the range resolution of stereo vision for entry, descent, and landing (EDL) missions to Mars and other planetary bodies. The goal is to create accurate and precise three-dimensional planetary surface-structure estimates by filtering sequences of stereo images taken from an autonomous landing vehicle. We describe a sliding window filter (SWF) approach based on delayed-state marginalization. The SWF can run in constant time, yet still achieve experimental results close to those of the bundle adjustment solution. This technique can scale from the offline batch least-squares solution to fast online incremental solutions. For instance, if the window encompasses all time, the solution is equivalent to full bundle adjustment; if only one time step is maintained, the solution matches the extended Kalman filter; if poses and landmarks are slowly marginalized out over time such that the state vector ceases to grow, then the filter becomes constant time, like visual odometry. Within the constant-time regime, the sliding window approach demonstrates convergence properties that are close to those of the full batch solution and strictly superior to visual odometry. Experiments with real data show that ground-structure estimates follow the convergence pattern predicted by theory. These experiments indicate the effectiveness of filtering long-range stereo for EDL. © 2010 Wiley Periodicals, Inc.
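
As a rough illustration of the delayed-state marginalization that underlies the SWF, the following Python/NumPy sketch folds one state block out of a Gaussian estimate kept in information form via the Schur complement; the function name, block layout, and toy numbers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def marginalize(H, b, keep_idx, marg_idx):
    """Marginalize the states in marg_idx out of the information-form
    estimate (H, b), keeping keep_idx, via the Schur complement.

    Removed states are absorbed into a dense prior over the states they
    were connected to, so the active window stays a fixed size.
    (Illustrative sketch only, not the paper's implementation.)
    """
    Hkk = H[np.ix_(keep_idx, keep_idx)]
    Hkm = H[np.ix_(keep_idx, marg_idx)]
    Hmm = H[np.ix_(marg_idx, marg_idx)]
    bk, bm = b[keep_idx], b[marg_idx]

    Hmm_inv = np.linalg.inv(Hmm)            # small block, e.g. one old pose
    H_prior = Hkk - Hkm @ Hmm_inv @ Hkm.T   # prior information on kept states
    b_prior = bk - Hkm @ Hmm_inv @ bm
    return H_prior, b_prior

# Toy example: three 1-D pose states linked by unit-information odometry factors.
H = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
b = np.array([0.5, 0.0, 0.2])

# Slide the window forward by marginalizing the oldest pose (index 0).
H_win, b_win = marginalize(H, b, keep_idx=[1, 2], marg_idx=[0])
```

Repeating this step as new poses and landmarks are added keeps the state vector from growing, which is what makes the constant-time regime described in the abstract possible.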