1-Point RANSAC for extended Kalman filtering: Application to real-time structure from motion and visual odometry

  • Authors:
  • Javier Civera; Oscar G. Grasa; Andrew J. Davison; J. M. M. Montiel

  • Affiliations:
  • Javier Civera, Oscar G. Grasa, J. M. M. Montiel: Robotics, Perception and Real-Time Group, Universidad de Zaragoza, Zaragoza 50018, Spain
  • Andrew J. Davison: Department of Computing, Imperial College, London SW7 2AZ, United Kingdom

  • Venue:
  • Journal of Field Robotics - Visual Mapping and Navigation Outdoors
  • Year:
  • 2010

Abstract

Random sample consensus (RANSAC) has become one of the most successful techniques for robust estimation from a data set that may contain outliers. It works by constructing model hypotheses from random minimal data subsets and evaluating their validity from the support of the whole data set. In this paper we present a novel combination of RANSAC and the extended Kalman filter (EKF) that uses the prior probabilistic information available from the EKF in the RANSAC hypothesis generation stage. This allows the minimal sample size to be reduced to one, resulting in large computational savings without loss of discriminative power. 1-Point RANSAC is shown to outperform the joint compatibility branch and bound (JCBB) algorithm, a gold-standard technique for spurious rejection within the EKF framework, in both accuracy and computational cost. Two visual estimation scenarios are used in the experiments: first, six-degree-of-freedom (DOF) motion estimation from a monocular sequence (structure from motion). Here, a new method for benchmarking six-DOF visual estimation algorithms based on the use of high-resolution images is presented, validated, and used to show the superiority of 1-Point RANSAC. Second, we demonstrate long-term robot trajectory estimation combining monocular vision and wheel odometry (visual odometry). Here, a comparison against the global positioning system (GPS) shows an accuracy comparable to state-of-the-art visual odometry methods. © 2010 Wiley Periodicals, Inc.
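The hypothesize-and-vote loop with a minimal sample size of one can be sketched on a toy problem. The snippet below is an illustrative simplification, not the authors' EKF formulation: each hypothesis is generated from a single datum (standing in for the one-point partial update that the EKF prior makes possible), support is counted by thresholding residuals, and the final averaging over the consensus set stands in for the EKF update over low-innovation inliers. The function name, threshold, and data are all assumptions for the example.

```python
import random

def one_point_ransac(data, threshold, n_hypotheses=50, seed=0):
    """Toy 1-point RANSAC: each hypothesis is built from a single
    datum (minimal sample size = 1); its support is the number of
    data points whose residual falls below `threshold`."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_hypotheses):
        # Hypothesize: one randomly chosen datum defines the model.
        model = rng.choice(data)
        # Vote: count the data points consistent with the hypothesis.
        inliers = [x for x in data if abs(x - model) < threshold]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refine over the consensus set (a stand-in for the EKF update
    # over low-innovation inliers in the paper's method).
    return sum(best_inliers) / len(best_inliers), best_inliers

# Inlier measurements near 10.0 plus two gross outliers.
data = [10.1, 9.9, 10.0, 10.2, 9.8, 42.0, -7.0, 10.05]
estimate, inliers = one_point_ransac(data, threshold=0.5)
```

Because a single datum suffices to instantiate a hypothesis, far fewer random draws are needed than with larger minimal sample sizes, which is the source of the computational savings the abstract describes.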