Simultaneous Motion and Structure Estimation by Fusion of Inertial and Vision Data

  • Authors:
  • Peter Gemeiner; Peter Einramhof; Markus Vincze

  • Affiliations:
  • Automation and Control Institute, Vienna University of Technology, Gusshausstrasse 27-29/376, 1040 Vienna, Austria (all authors)

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2007

Abstract

For mobile robotics, head gear in augmented reality (AR) applications, and computer vision, it is essential to continuously estimate the egomotion of the system and the structure of the environment. This paper presents the system developed in the SmartTracking project, which simultaneously estimates motion and structure by integrating visual and inertial sensors in a combined estimation scheme. The sparse structure estimation is based on the detection of corner features in the environment. Starting from a single known position, the system can move into an unknown environment. Vision and inertial data are fused, and the performance of an Unscented Kalman filter (UKF) is compared with that of an Extended Kalman filter (EKF) for this task. The filters are designed to handle asynchronous input from the visual and inertial sensors, which typically operate at different and possibly varying rates. Additionally, a bank of Extended Kalman filters, one per corner feature, estimates the position and quality of structure points and includes them in the structure estimation process. The system is demonstrated on a mobile robot executing known motions, so that the egomotion estimated in an unknown environment can be compared to ground truth.
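The abstract describes fusing asynchronous inertial and visual measurements in a Kalman filter: high-rate inertial samples drive the prediction step, while slower, possibly irregular camera measurements trigger updates at their own timestamps. The sketch below illustrates that event-driven pattern on a deliberately simplified 1-D constant-velocity model. Because this toy model is linear it reduces to an ordinary Kalman filter; the paper's actual EKF/UKF formulations, state vectors, sensor rates, and noise parameters are not given here, so the `AsyncFusionKF` class, the 100 Hz / 10 Hz rates, and all noise values are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch (not the paper's formulation): a 1-D filter whose
# state is [position, velocity]. Inertial (accelerometer) events propagate
# the state; camera events first propagate to the image timestamp, then
# correct the state with a position measurement.
class AsyncFusionKF:
    def __init__(self):
        self.x = np.zeros(2)   # state estimate: [position, velocity]
        self.P = np.eye(2)     # state covariance
        self.t = 0.0           # time of the last processed event

    def predict(self, t, accel, q=0.01):
        """Propagate the state to time t using an inertial sample."""
        dt = t - self.t
        F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
        B = np.array([0.5 * dt**2, dt])            # acceleration input
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + q * np.eye(2)  # simple process noise
        self.t = t

    def update(self, z, r=0.05):
        """Correct the state with a visual position measurement z."""
        H = np.array([[1.0, 0.0]])                 # camera observes position
        y = z - (H @ self.x)[0]                    # innovation
        S = (H @ self.P @ H.T)[0, 0] + r           # innovation covariance
        K = (self.P @ H.T).ravel() / S             # Kalman gain
        self.x = self.x + K * y
        self.P = (np.eye(2) - np.outer(K, H.ravel())) @ self.P

# Synthetic, noise-free demo: the platform moves at a constant 1 m/s.
# Assumed rates: 100 Hz inertial, 10 Hz camera, over 2 seconds.
events = [(k * 0.01, "imu", 0.0) for k in range(1, 201)]          # zero accel
events += [(k * 0.1, "camera", k * 0.1) for k in range(1, 21)]    # position
events.sort(key=lambda e: e[0])

kf = AsyncFusionKF()
for t, kind, value in events:
    if kind == "imu":
        kf.predict(t, accel=value)
    else:
        kf.predict(t, accel=0.0)   # coast to the image timestamp
        kf.update(value)
```

The key design point the sketch mirrors is that neither sensor forces a fixed filter cycle: events are consumed in timestamp order, each prediction spans whatever interval elapsed since the last event, so differing and varying sensor rates are handled naturally.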