Visual and inertial sensors, in combination, are well-suited for many robot navigation and mapping tasks. However, correct data fusion, and hence overall system performance, depends on accurate calibration of the 6-DOF transform between the sensors (one or more cameras and an inertial measurement unit). Obtaining this calibration information is typically difficult and time-consuming. In this paper, we describe an algorithm, based on the unscented Kalman filter (UKF), for simultaneous camera-IMU localization, mapping, and relative pose self-calibration. We show that the sensor-to-sensor transform, the IMU gyroscope and accelerometer biases, the local gravity vector, and the metric scene structure can all be recovered from camera and IMU measurements alone, without any prior knowledge of the environment in which the robot is operating. We present results from experiments with a monocular camera and a low-cost solid-state IMU, which demonstrate accurate estimation of both the calibration parameters and the local scene structure.
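The core machinery behind the UKF mentioned in the abstract is the unscented transform: a deterministic set of sigma points is propagated through a nonlinear function, and the weighted sample statistics of the transformed points approximate the posterior mean and covariance. The sketch below illustrates this step only; it is not the authors' implementation, and the function names, state layout, and scaling parameters (`alpha`, `beta`, `kappa`, standard Wan-van der Merwe values) are illustrative assumptions.

```python
import numpy as np

def sigma_points(mean, cov, alpha=0.1, beta=2.0, kappa=0.0):
    """Generate 2n+1 scaled sigma points and their mean/covariance weights.

    Uses the standard Wan-van der Merwe scaling (illustrative defaults)."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Matrix square root of the scaled covariance via Cholesky factorization.
    S = np.linalg.cholesky((n + lam) * cov)
    # Central point, then symmetric points along each column of S.
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))  # mean weights
    wc = wm.copy()                                    # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def unscented_transform(f, mean, cov):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f."""
    pts, wm, wc = sigma_points(mean, cov)
    ypts = np.array([f(p) for p in pts])
    ymean = wm @ ypts
    d = ypts - ymean
    ycov = (wc[:, None] * d).T @ d
    return ymean, ycov

# Toy usage: propagate a range/bearing estimate into Cartesian coordinates,
# a nonlinearity similar in spirit to the camera measurement model.
m = np.array([10.0, 0.5])                 # [range, bearing]
P = np.diag([0.04, 0.01])                 # measurement uncertainty
to_xy = lambda s: np.array([s[0] * np.cos(s[1]), s[0] * np.sin(s[1])])
xy_mean, xy_cov = unscented_transform(to_xy, m, P)
```

In the full filter, the same transform is applied twice per step: once to push the state (pose, velocity, IMU biases, camera-IMU transform, landmark positions) through the IMU-driven motion model, and once to map predicted sigma points into expected camera measurements for the update.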