On multi-rate fusion for non-linear sampled-data systems: Application to a 6D tracking system

  • Authors:
  • L. Armesto, J. Tornero, M. Vincze

  • Affiliations:
  • L. Armesto and J. Tornero: Department of Systems and Control Engineering, Technical University of Valencia, Camino de Vera s/n, 46022 Valencia, Spain. M. Vincze: Automation and Control Institute, Vienna University of Technology, Gusshausstr. 27-29/361, A-1040 Vienna, Austria

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2008

Abstract

Egomotion estimation, e.g. for robot navigation or augmented reality applications, requires fusing measurements from different sensors in a non-linear sampled-data system. An example is fusing the complementary characteristics of visual and inertial sensors. Existing approaches either use Kalman filters in conventionally sampled systems or use Particle filters to accommodate the uncertainty of motion models. This paper introduces an approach that models multi-rate non-linear systems to exploit the characteristics of both sensors, assuming synchronous and periodic measurements. The final contribution of this paper is an in-depth analysis and performance comparison of the Extended Kalman filter, the Unscented Kalman filter and three Particle filters (Bootstrap, Extended and Unscented). While there is much debate over the pros and cons of these two approaches, this work shows the following results for fusing visual and inertial data in 6 DOF (position and orientation) in a tracking application: the Bootstrap Particle filter gives a higher estimation error than the Extended and Unscented Particle filters, which give results very similar to those of the Extended and Unscented Kalman filters but at a considerably higher computational cost.
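The multi-rate fusion idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's formulation: it is a simplified linear 1-D constant-velocity Kalman filter in which a fast "inertial" channel (velocity) is corrected at every base-rate sample, while a slow "visual" channel (position) is applied only every `visual_every` samples, mirroring the assumption of synchronous, periodic measurements. All noise values and rates are illustrative assumptions.

```python
import numpy as np

def multirate_kf(z_inertial, z_visual, dt=0.01, visual_every=5):
    """Sketch of multi-rate fusion: predict at the fast (inertial) rate,
    apply the slow (visual) correction only when it is available.
    State: [position, velocity]. All parameters are illustrative."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
    Q = np.diag([1e-6, 1e-4])                  # process noise (assumed)
    H_i = np.array([[0.0, 1.0]])               # inertial channel measures velocity
    R_i = np.array([[1e-2]])
    H_v = np.array([[1.0, 0.0]])               # visual channel measures position
    R_v = np.array([[1e-3]])
    x = np.zeros(2)
    P = np.eye(2)
    estimates = []
    for k, z_i in enumerate(z_inertial):
        # Prediction at the base (inertial) sampling rate
        x = F @ x
        P = F @ P @ F.T + Q
        # Inertial update every step; visual update every visual_every steps
        updates = [(H_i, R_i, z_i)]
        if k % visual_every == 0:
            updates.append((H_v, R_v, z_visual[k // visual_every]))
        for H, R, z in updates:
            y = z - H @ x                      # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

With noiseless measurements from a unit-velocity trajectory, the estimate converges to the true position and velocity; in the paper's setting the same scheduling idea is applied to non-linear 6-DOF models via EKF/UKF/particle variants.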