Omnidirectional Vision and Inertial Clues for Robot Navigation. Journal of Robotic Systems.
A Flexible Software Architecture for Hybrid Tracking. Journal of Robotic Systems.
Fusion of Vision and Inertial Data for Motion and Structure Estimation. Journal of Robotic Systems.
Parallel Tracking and Mapping for Small AR Workspaces. Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR '07).
Fly-inspired visual steering of an ultralight indoor aircraft. IEEE Transactions on Robotics.
Parallel tracking and mapping for controlling VTOL airframe. Journal of Control Science and Engineering.
Scaled monocular SLAM for walking people. Proceedings of the 2013 International Symposium on Wearable Computers.
UGV-MAV Collaboration for Augmented 2D Maps. Proceedings of the Conference on Advances in Robotics.
Bearing-only visual SLAM for small unmanned aerial vehicles in GPS-denied environments. International Journal of Automation and Computing.
Journal of Intelligent and Robotic Systems
The fusion of inertial and visual data is widely used to improve an object's pose estimation. However, this type of fusion is rarely used to estimate further unknowns in the visual framework. In this paper we present and compare two approaches to estimating the unknown scale parameter in a monocular SLAM framework; directly linked to the scale is the estimation of the object's absolute velocity and position in 3D. The first approach is a spline-fitting task adapted from Jung and Taylor, and the second is an extended Kalman filter. Both methods were simulated offline on arbitrary camera paths to analyze their behavior and the quality of the resulting scale estimate. We then embedded an online multi-rate extended Kalman filter, together with an inertial sensor, in the Parallel Tracking and Mapping (PTAM) algorithm of Klein and Murray. Within this inertial/monocular SLAM framework we demonstrate real-time, robust, and fast-converging scale estimation. Our approach depends neither on known patterns in the vision part nor on a complex temporal synchronization between the visual and inertial sensors.
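The core idea of EKF-based scale estimation can be illustrated with a minimal sketch: IMU acceleration drives the state prediction in metric units, while the monocular SLAM position, which is only known up to an unknown scale factor, enters as the measurement z = lambda * p. The 1-D state layout, noise values, and test trajectory below are invented for illustration; this is not the authors' implementation, which operates in 3D inside PTAM.

```python
import numpy as np

def ekf_scale(accels, vision, dt, lam0=1.0):
    """Toy 1-D EKF estimating the monocular-SLAM scale factor.

    State: [p, v, lam] = metric position, metric velocity, scale.
    Prediction uses IMU acceleration; the update uses the SLAM
    position measurement z = lam * p (arbitrary SLAM units).
    """
    x = np.array([0.0, 0.0, lam0])      # initial guess (scale unknown)
    P = np.diag([1.0, 1.0, 10.0])       # large initial scale uncertainty
    Q = np.diag([1e-4, 1e-3, 1e-6])     # process noise (assumed values)
    R = 1e-4                            # vision measurement noise
    F = np.array([[1.0, dt, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
    for a, z in zip(accels, vision):
        # Prediction: integrate the IMU acceleration in metric units.
        p, v, lam = x
        x = np.array([p + v*dt + 0.5*a*dt*dt, v + a*dt, lam])
        P = F @ P @ F.T + Q
        # Update: unscaled vision position, h(x) = lam * p.
        p, v, lam = x
        H = np.array([lam, 0.0, p])     # Jacobian of h w.r.t. [p, v, lam]
        S = H @ P @ H + R
        K = P @ H / S
        x = x + K * (z - lam * p)
        P = (np.eye(3) - np.outer(K, H)) @ P
    return x

if __name__ == "__main__":
    # Synthetic trajectory p(t) = 1 - cos(t); true scale 0.4 (assumed).
    dt, lam_true = 0.01, 0.4
    p = v = 0.0
    accels, vision = [], []
    for k in range(1000):
        a = np.cos(k * dt)
        accels.append(a)
        p, v = p + v*dt + 0.5*a*dt*dt, v + a*dt
        vision.append(lam_true * p)
    print("estimated scale:", ekf_scale(accels, vision, dt)[2])
```

Because the scale only appears in the measurement Jacobian through the current position, it is unobservable while the camera is at rest; the filter converges once the trajectory is sufficiently excited, which matches the paper's use of arbitrary camera paths in simulation.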