Vision-aided inertial navigation systems (V-INSs) can provide precise state estimates for the 3-D motion of a vehicle when no external references (e.g., GPS) are available. This is achieved by combining inertial measurements from an inertial measurement unit (IMU) with visual observations from a camera, under the assumption that the rigid transformation between the two sensors is known. Errors in the IMU-camera extrinsic calibration introduce biases that reduce estimation accuracy and can even lead to divergence of any estimator processing the measurements from both sensors. In this paper, we present an extended Kalman filter for precisely determining the unknown transformation between a camera and an IMU. Contrary to previous approaches, we explicitly account for the time correlation of the IMU measurements and provide a figure of merit (covariance) for the estimated transformation. The proposed method does not require any special hardware (such as a spin table or a 3-D laser scanner) except a calibration target. Furthermore, we employ the observability rank criterion based on Lie derivatives and prove that the nonlinear system describing the IMU-camera calibration process is observable. Simulation and experimental results are presented that validate the proposed method and quantify its accuracy.
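To illustrate the core idea of estimating a fixed sensor-to-sensor quantity with a Kalman filter that also reports a covariance as a figure of merit, here is a deliberately simplified, hypothetical sketch. It estimates only a constant 3-D IMU-camera translation from noisy direct observations with an identity measurement model; the paper's actual filter is a full nonlinear EKF over vehicle pose, velocity, IMU biases, and the 6-DOF extrinsic transform, with time-correlated IMU measurements.

```python
import numpy as np

def calibrate_translation(measurements, r=0.05**2, p0=1.0):
    """Toy Kalman filter: estimate a constant 3-D translation (e.g., an
    IMU-camera lever arm) from noisy direct observations of it.

    measurements : iterable of length-3 arrays (noisy observations, m)
    r            : measurement noise variance per axis (assumed value)
    p0           : initial state variance per axis (assumed value)
    Returns the state estimate and its covariance (the "figure of merit").
    """
    x = np.zeros(3)        # state: estimated translation
    P = np.eye(3) * p0     # state covariance
    R = np.eye(3) * r      # measurement noise covariance
    I = np.eye(3)
    for z in measurements:
        # The state is constant, so the prediction step leaves x and P
        # unchanged. Update step with measurement model H = I:
        S = P + R                      # innovation covariance
        K = P @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (np.asarray(z) - x)
        P = (I - K) @ P
    return x, P

# Usage sketch: simulate noisy observations of a true lever arm and
# check that the estimate converges while the covariance shrinks.
rng = np.random.default_rng(0)
true_t = np.array([0.10, -0.02, 0.05])          # hypothetical ground truth
meas = true_t + rng.normal(0.0, 0.05, size=(200, 3))
x_hat, P_hat = calibrate_translation(meas)
```

After 200 observations the estimate lands within a few millimeters of the true translation, and the diagonal of `P_hat` quantifies the remaining uncertainty, mirroring (in miniature) the covariance output the abstract highlights.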