A novel method for estimating vehicle roll, pitch, and yaw using machine vision and inertial sensors is presented, based on matching images captured by an on-vehicle camera to a rendered representation of the surrounding terrain obtained from a three-dimensional (3D) terrain map. U.S. Geological Survey Digital Elevation Maps were used to create a 3D topological map of the geography surrounding the vehicle, and it is assumed in this work that large segments of the surrounding terrain are visible, particularly the horizon lines. The horizon lines seen in the video captured from the vehicle are compared to the horizon lines obtained from the rendered geography, allowing absolute comparisons between the rendered and actual scenes in roll, pitch, and yaw. A kinematic Kalman filter modeling an inertial navigation system then uses the scene matching to generate filtered estimates of orientation. Numerical simulations verify the performance of the Kalman filter, and experiments using an instrumented vehicle operating at the test track of the Pennsylvania Transportation Institute were performed to check the validity of the method. When compared to estimates from a global positioning system-inertial measurement unit (IMU) system, the roll, pitch, and yaw estimates from the vision-IMU Kalman filter agree within 2σ bounds of 0.5, 0.26, and 0.8 deg, respectively. © 2008 Wiley Periodicals, Inc.
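The fusion step described in the abstract — integrating gyro rates while correcting drift with absolute orientation fixes from horizon matching — can be sketched as a one-axis kinematic Kalman filter. This is a minimal illustration, not the paper's implementation: the state layout, noise values, and function names below are assumptions chosen for clarity.

```python
import numpy as np

def kinematic_kf(gyro_rates, vision_angles, dt,
                 q_angle=1e-4, q_bias=1e-6, r_vision=0.05**2):
    """One-axis kinematic Kalman filter (illustrative sketch).

    State x = [angle, gyro_bias]. The prediction integrates the biased gyro
    rate; the update corrects drift with an absolute angle measurement, such
    as one derived from matching the camera horizon to a rendered horizon.
    All noise parameters here are assumed values, not from the paper.
    """
    F = np.array([[1.0, -dt],      # angle += (rate - bias) * dt
                  [0.0, 1.0]])     # bias modeled as a random walk
    B = np.array([dt, 0.0])
    H = np.array([[1.0, 0.0]])     # vision measures the angle directly
    Q = np.diag([q_angle, q_bias])
    R = np.array([[r_vision]])

    x = np.zeros(2)
    P = np.eye(2)
    estimates = []
    for rate, z in zip(gyro_rates, vision_angles):
        # Predict: propagate state with the gyro rate as a control input.
        x = F @ x + B * rate
        P = F @ P @ F.T + Q
        # Update: fuse the absolute angle fix when one is available.
        if z is not None:
            y = z - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)
```

In use, the same filter structure would run once per axis (roll, pitch, yaw), with `vision_angles` set to `None` at time steps where no horizon match is obtained, so the estimate coasts on the gyro between fixes.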