Localization with multi-modal vision measurements in limited GPS environments using Gaussian sum filters

  • Authors:
  • Jonathan R. Schoenberg; Mark Campbell; Isaac Miller

  • Affiliations:
  • Sibley School of Mechanical and Aerospace Engineering, Cornell University, Ithaca, NY (all authors)

  • Venue:
  • ICRA '09: Proceedings of the 2009 IEEE International Conference on Robotics and Automation
  • Year:
  • 2009

Abstract

A Gaussian sum filter (GSF) with extended Kalman filter (EKF) components is proposed to localize an autonomous vehicle in an urban environment with limited GPS availability. The GSF fuses vehicle-relative, vision-based measurements of known map features with inertial navigation solutions to accomplish localization in the absence of GPS. The vision-based measurements are shown to have multi-modal measurement likelihood functions that are well represented as weighted sums of Gaussian densities, making the GSF ideally suited to recursive Bayesian state estimation for this problem. After each multi-modal measurement is fused, a sequential merging technique condenses the Gaussian mixture in the posterior density approximation, keeping the mixture size bounded over time. The GSF's representation of the posterior density is compared against a benchmark particle filter solution on a common dataset. The Expectation-Maximization (EM) algorithm is used offline to determine the representational efficiency of the particle filter in terms of an effective number of Gaussian densities. The GSF with vision-based, vehicle-relative measurements remains converged over 37 minutes of data recorded by the Cornell University DARPA Urban Challenge (DUC) autonomous vehicle in an urban environment, including a 32-minute GPS blackout.
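The two core ideas in the abstract, a GSF measurement update against a multi-modal (Gaussian-mixture) likelihood, followed by mixture condensation via sequential moment-preserving merges, can be sketched minimally as below. This is an illustrative scalar, linear-Gaussian sketch, not the paper's implementation: the function names, the representation of mixtures as `(weight, mean, covariance)` tuples, and the nearest-means pairing rule in `condense` are all assumptions made for the example.

```python
import numpy as np

def kalman_update(m, P, z, H, R):
    # Scalar Kalman update; returns posterior mean, covariance,
    # and the Gaussian likelihood of the measurement z.
    S = H * P * H + R                     # innovation covariance
    K = P * H / S                         # Kalman gain
    nu = z - H * m                        # innovation
    lik = np.exp(-0.5 * nu**2 / S) / np.sqrt(2 * np.pi * S)
    return m + K * nu, (1 - K * H) * P, lik

def gsf_update(prior, meas_modes, z, H):
    # prior: list of (w, m, P) Gaussian components.
    # meas_modes: list of (v, bias, R) modes of the multi-modal
    # measurement likelihood, itself a weighted sum of Gaussians.
    # Each prior component is updated against each mode; the posterior
    # mixture therefore grows multiplicatively and must be condensed.
    post = []
    for w, m, P in prior:
        for v, b, R in meas_modes:
            m2, P2, lik = kalman_update(m, P, z - b, H, R)
            post.append((w * v * lik, m2, P2))
    total = sum(c[0] for c in post)
    return [(w / total, m, P) for w, m, P in post]

def merge_pair(c1, c2):
    # Moment-preserving merge of two weighted Gaussian components.
    (w1, m1, P1), (w2, m2, P2) = c1, c2
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    P = (w1 * (P1 + (m1 - m)**2) + w2 * (P2 + (m2 - m)**2)) / w
    return (w, m, P)

def condense(mix, max_comp):
    # Sequentially merge the pair of components with the closest means
    # until the mixture fits the size budget (a simple stand-in for the
    # paper's sequential merging criterion).
    mix = list(mix)
    while len(mix) > max_comp:
        i, j = min(((a, b) for a in range(len(mix))
                    for b in range(a + 1, len(mix))),
                   key=lambda ab: abs(mix[ab[0]][1] - mix[ab[1]][1]))
        merged = merge_pair(mix[i], mix[j])
        mix = [c for k, c in enumerate(mix) if k not in (i, j)] + [merged]
    return mix
```

One update against a two-mode likelihood doubles the mixture size, which is why the condensation step is needed to keep the filter tractable over a long run such as the 32-minute GPS blackout.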