Autonomous navigation of a rover on the Martian surface can significantly improve the daily traverse, particularly when driving away from the lander into unknown areas. The autonomous navigation process developed at CNES is based on stereo camera perception, used to build a model of the environment and to generate trajectories. Merging of multiple perceptions, with propagation of the locomotion and localization errors, has been implemented. The algorithms developed for Mars exploration programs, the vision hardware, the validation tools, the experimental platforms, and the evaluation results are presented. Portability and the evaluation of the computing resources required for implementation on a Mars rover are also addressed. The results show that this autonomy requires only a small amount of energy and computing time and that the rover's capabilities are fully used, allowing a much longer daily traverse than purely ground-planned strategies permit.
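The two computations the abstract names, depth from stereo perception and propagation of localization error along the traverse, can be sketched as follows. This is a minimal illustration, not the CNES implementation; the function names, the straight-drive assumption, and the independent slip/heading noise model are assumptions made here for clarity.

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def propagate_pose_covariance(P, step_m, slip_std, heading_std):
    """First-order growth of 2-D position covariance after a straight drive
    of step_m metres, with independent along-track (wheel slip) and
    cross-track (heading drift) error sources -- an illustrative model only."""
    Q = np.diag([(step_m * slip_std) ** 2,      # along-track variance added
                 (step_m * heading_std) ** 2])  # cross-track variance added
    return P + Q

# A 10 px disparity with a 500 px focal length and 0.10 m baseline -> 5 m depth.
z = stereo_depth(10.0, 500.0, 0.10)

# Uncertainty accumulates step by step over a 20 m traverse in 1 m segments.
P = np.zeros((2, 2))
for _ in range(20):
    P = propagate_pose_covariance(P, step_m=1.0, slip_std=0.05, heading_std=0.01)
```

Under this model the position variance grows linearly with distance driven, which is why the abstract emphasizes merging multiple perceptions: each new stereo model re-anchors the trajectory before the accumulated error becomes significant.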