In this paper, we present the vision-aided inertial navigation (VISINAV) algorithm, which enables precision planetary landing. The vision front-end of the VISINAV system extracts 2-D-to-3-D correspondences between descent images and a surface map (mapped landmarks), as well as 2-D-to-2-D feature tracks through a sequence of descent images (opportunistic features). An extended Kalman filter (EKF) tightly integrates both types of visual feature observations with measurements from an inertial measurement unit. The filter computes accurate estimates of the lander's terrain-relative position, attitude, and velocity in a resource-adaptive, and hence real-time capable, fashion. In addition to a technical analysis of the algorithm, the paper presents validation results from a sounding-rocket test flight, showing estimation errors of only 0.16 m/s for velocity and 6.4 m for position at touchdown. These results vastly improve upon the current state of the art for terminal-descent navigation without visual updates, and meet the requirements of future planetary exploration missions.
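The fusion step described above follows the standard EKF pattern: visual measurements correct a state propagated from IMU data. The sketch below shows only the generic EKF measurement update on a toy two-state system; it is not the paper's actual tightly-coupled filter (which maintains attitude, position, and velocity and processes both mapped-landmark and opportunistic-feature observations), and all names and dimensions here are illustrative assumptions.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Generic linear EKF measurement update (illustrative sketch only).

    x : state estimate, P : state covariance,
    z : measurement, H : measurement Jacobian, R : measurement noise.
    """
    y = z - H @ x                        # innovation (residual)
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ y                    # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P # corrected covariance
    return x_new, P_new

# Toy example: state = [position, velocity]; a position-only fix
# (loosely analogous to a mapped-landmark observation constraining position).
x = np.array([0.0, 1.0])
P = np.eye(2)
H = np.array([[1.0, 0.0]])   # measurement observes position only
R = np.array([[0.5]])        # measurement noise variance
z = np.array([0.4])          # observed position

x_new, P_new = ekf_update(x, P, z, H, R)
```

In the full system, `H` would be the linearized camera projection of a landmark, and the same update machinery absorbs both observation types; the resource-adaptive behavior comes from choosing how many features to process per frame.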