The lure of using motion vision as a fundamental element in the perception of space drives this effort to use flow features as the sole cues for robot mobility. Real-time estimates of image flow and flow divergence provide the robot's sense of space. The robot steers down a conceptual corridor by comparing left and right peripheral flows. Large central flow divergence warns the robot of impending collisions at "dead ends"; when this occurs, the robot turns around and resumes wandering. Behavior is generated by using flow-based information directly in the 2D image sequence; no 3D reconstruction is attempted. Active mechanical gaze stabilization simplifies visual interpretation by reducing camera rotation. By combining corridor following and dead-end deflection, the robot has wandered around the lab at 30 cm/s for as long as 20 minutes without collision. The ability to support this behavior in real time with current equipment promises expanded capabilities as computational power increases.
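The control policy described above can be sketched as two simple rules: steer toward the side with the smaller average peripheral flow magnitude (the farther wall), and trigger a turn-around when central flow divergence exceeds a threshold. The following is a minimal illustrative sketch, not the authors' implementation; the function names, the averaging scheme, and the threshold and gain values are all assumptions.

```python
def steer_command(left_peripheral_flow, right_peripheral_flow, gain=1.0):
    """Corridor-following rule: balance left and right peripheral flow.

    Flow magnitude is larger for the nearer wall, so steering away from
    the side with larger flow keeps the robot centered in the conceptual
    corridor. Inputs are sequences of flow magnitudes sampled from the
    left and right image periphery (a simplifying assumption).
    """
    left = sum(abs(f) for f in left_peripheral_flow) / max(len(left_peripheral_flow), 1)
    right = sum(abs(f) for f in right_peripheral_flow) / max(len(right_peripheral_flow), 1)
    # Positive output -> turn right, i.e. away from the faster (nearer) left side.
    return gain * (left - right)


def dead_end_detected(central_divergence, threshold=0.5):
    """Dead-end rule: large central flow divergence warns of impending
    collision, at which point the robot turns around and resumes wandering.
    The threshold value here is purely illustrative."""
    return central_divergence > threshold
```

In use, each control cycle would compute image flow in real time, feed the peripheral magnitudes to `steer_command`, and check `dead_end_detected` on the central divergence estimate before issuing a motion command.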