Robust visual odometry using uncertainty models

  • Authors:
  • David Van Hamme, Peter Veelaert, Wilfried Philips

  • Affiliations:
  • University College Ghent, Vision Systems, and Ghent University/IBBT (IPI) (Van Hamme, Veelaert); Ghent University/IBBT (IPI) (Philips)

  • Venue:
  • ACIVS'11: Proceedings of the 13th International Conference on Advanced Concepts for Intelligent Vision Systems
  • Year:
  • 2011


Abstract

In dense urban environments, GPS alone cannot be relied on to provide accurate positioning. Signal reception issues (e.g. occlusion, multipath effects) often prevent the GPS receiver from obtaining a positional lock, causing gaps in the absolute positioning data. To keep assisting the driver, other sensors are required to track the vehicle motion during these periods of GPS disturbance. In this paper, we propose a novel method that uses a single on-board consumer-grade camera to estimate the relative vehicle motion. The method is based on the tracking of ground plane features, taking into account the uncertainty in their backprojection as well as the uncertainty in the vehicle motion. A Hough-like parameter space vote is employed to extract motion parameters from the uncertainty models. The method is easy to calibrate and designed to be robust to outliers and poor feature quality. Preliminary testing shows good accuracy and reliability, with a positional estimate within 2 metres after 400 metres of travel. The effects of inaccurate calibration are examined on artificial datasets, suggesting a self-calibrating system may be possible in future work.
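The abstract's core idea, a Hough-like vote over motion parameters computed from tracked ground-plane features, can be illustrated with a small sketch. The paper's actual parameterization, uncertainty models, and bin sizes are not given here, so the following is only an assumed minimal version: planar motion (one rotation angle plus a 2D translation), matched ground-plane points in the vehicle frame, and a uniform accumulator grid. Each feature match votes for the parameter bins consistent with it, and the densest bin wins, which is what makes the vote robust to outlier matches.

```python
import numpy as np

def hough_vote_motion(pts_prev, pts_curr,
                      theta_range=(-0.05, 0.05), n_theta=21,
                      t_range=(-1.0, 1.0), n_t=41):
    """Estimate planar vehicle motion (rotation theta, translation tx, ty)
    from matched ground-plane features via a Hough-style parameter vote.

    pts_prev, pts_curr: (N, 2) arrays of matched ground-plane positions
    in the vehicle frame at two consecutive time steps. Returns the
    (theta, tx, ty) bin that collected the most votes.
    """
    thetas = np.linspace(*theta_range, n_theta)
    t_bins = np.linspace(*t_range, n_t)
    acc = np.zeros((n_theta, n_t, n_t))  # accumulator over parameter space

    for i, th in enumerate(thetas):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        # Translation implied by each match under candidate rotation th.
        t = pts_curr - pts_prev @ R.T
        ix = np.digitize(t[:, 0], t_bins) - 1
        iy = np.digitize(t[:, 1], t_bins) - 1
        ok = (ix >= 0) & (ix < n_t) & (iy >= 0) & (iy < n_t)
        np.add.at(acc[i], (ix[ok], iy[ok]), 1)  # cast one vote per match

    i, jx, jy = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[i], t_bins[jx], t_bins[jy]
```

In the paper each match would spread its vote over a region of parameter space shaped by the backprojection and motion uncertainty models, rather than casting a single hard vote per bin as this sketch does.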