Road edge tracking for robot road following: a real-time implementation
Image and Vision Computing - Special issue: frequency increase for 1991
Rapidly adapting artificial neural networks for autonomous navigation
NIPS-3 Proceedings of the 1990 conference on Advances in neural information processing systems 3
Recursive 3-D Road and Relative Ego-State Recognition
IEEE Transactions on Pattern Analysis and Machine Intelligence - Special issue on interpretation of 3-D scenes—part II
Visual tracking of known three-dimensional objects
International Journal of Computer Vision
Finding road lane boundaries for vision-guided vehicle navigation
Vision-based vehicle guidance
A parallel architecture for curvature-based road scene classification
Vision-based vehicle guidance
Model-based object tracking in monocular image sequences of road traffic scenes
International Journal of Computer Vision
Adaptive road parameter estimation in monocular image sequences
BMVC 94 Proceedings of the conference on British machine vision (vol. 2)
Vision-Based Vehicle Guidance
A Compact Vision System for Road Vehicle Guidance
ICPR '96 Proceedings of the International Conference on Pattern Recognition (ICPR '96) Volume III
An Iconic Classification Scheme for Video-Based Traffic Sensor Tasks
CAIP '01 Proceedings of the 9th International Conference on Computer Analysis of Images and Patterns
An approach to road recognition and ego-state tracking in monocular image sequences of traffic scenes is described. The main contribution of this paper is the adaptive recognition scheme, which handles competing road hypotheses, and its application in several processing steps of an image-sequence analysis system. No manual initialization of the tracked road is required, and changes of the road type are allowed. The road parameters to be recognized are the road width, the number of road lanes and the road curvature. For exact estimation of the road curvature, the translational and rotational velocities of the ego-car are assumed to be available. The estimated ego-state parameters are the camera orientation (derived via vanishing-point tracking) and the camera position relative to the road center line.
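As a rough illustration of the vanishing-point step mentioned in the abstract (a minimal sketch under a pinhole camera model with negligible roll, not the paper's actual formulation; the function name and parameters are assumptions for this example), the camera yaw and pitch relative to the road axis can be read off the image position of the lane boundaries' vanishing point:

```python
import math

def orientation_from_vanishing_point(u_v, v_v, fx, fy, cx, cy):
    """Recover camera yaw and pitch relative to the road direction from
    the tracked vanishing point (u_v, v_v) of the lane boundaries.

    Pinhole model: a line parallel to the road axis vanishes at
    (cx + fx*tan(yaw), cy - fy*tan(pitch)) when roll is negligible,
    so inverting gives the two angles in radians."""
    yaw = math.atan2(u_v - cx, fx)    # heading offset from the road axis
    pitch = math.atan2(cy - v_v, fy)  # tilt of the optical axis above the horizon
    return yaw, pitch

# Example: intrinsics fx = fy = 800 px, principal point (320, 240).
# A vanishing point at the principal point means the camera looks
# straight down the road; one shifted fx pixels right means 45° of yaw.
print(orientation_from_vanishing_point(320, 240, 800, 800, 320, 240))
print(orientation_from_vanishing_point(1120, 240, 800, 800, 320, 240))
```

Tracking this point across the sequence is what lets the system maintain the orientation estimate without manual initialization.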