Long range stereo data-fusion from moving platforms

  • Authors:
  • Gaurav Sukhatme; Larry Matthies; Gabe Sibley

  • Affiliations:
  • University of Southern California; University of Southern California; University of Southern California

  • Venue:
  • Long range stereo data-fusion from moving platforms
  • Year:
  • 2007


Abstract

This work is concerned with accurate and precise long-range depth estimation by filtering sequences of images. We investigate the problem with both stationary and mobile cameras. We find that, with normally distributed sensor noise, the non-linear nature of stereo triangulation, whether from a motion baseline or from a stereo rig, can lead to statistically biased range estimates. In the static-camera case we develop two novel filters to overcome this: a second-order Gauss-Newton filter and an Iterated Sigma Point Kalman filter. For the moving-sensor case we develop a sliding window filter for Simultaneous Localization and Mapping that concentrates computational resources on accurately estimating the immediate spatial surroundings, using a sliding time window of the most recent sensor measurements. The sliding window method exhibits many interesting properties, such as reversible data association, constant-time complexity during exploration, and robust estimation across multiple timesteps. Importantly, the sliding window filter has the potential to approximate the batch estimator in terms of both optimality and efficiency. To this end we demonstrate convergence results using real and simulated image data.
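
The bias claim follows from the convexity of the disparity-to-range mapping z = f·b/d: by Jensen's inequality, zero-mean Gaussian noise on disparity yields a mean triangulated range that overestimates the true range, and the effect grows as disparity shrinks at long range. Below is a minimal Monte Carlo sketch of this effect; the camera parameters (focal length, baseline, noise level, target range) are hypothetical values chosen for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical camera parameters (for illustration only, not from the paper).
focal_px = 500.0      # focal length in pixels
baseline_m = 0.3      # stereo baseline in metres
true_range_m = 100.0  # long-range target
sigma_d = 0.3         # std. dev. of Gaussian disparity noise, in pixels

rng = np.random.default_rng(0)

true_disparity = focal_px * baseline_m / true_range_m   # 1.5 px at 100 m

# Simulate many independent disparity measurements corrupted by zero-mean
# Gaussian noise, then triangulate each one through the non-linear 1/d mapping.
noisy_disparity = true_disparity + sigma_d * rng.standard_normal(1_000_000)
noisy_disparity = noisy_disparity[noisy_disparity > 0.5]  # drop near-degenerate samples
ranges = focal_px * baseline_m / noisy_disparity

print(f"true range:             {true_range_m:.2f} m")
print(f"mean estimated range:   {ranges.mean():.2f} m   (biased long)")
print(f"median estimated range: {np.median(ranges):.2f} m")
```

Averaging the triangulated ranges directly therefore drifts away from the true depth, which is the motivation for filters that handle the non-linearity explicitly (such as the second-order Gauss-Newton and Iterated Sigma Point Kalman filters described in the abstract) rather than assuming Gaussian errors in range space.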