Corridor Navigation and Obstacle Avoidance using Visual Potential for Mobile Robot
CRV '07 Proceedings of the Fourth Canadian Conference on Computer and Robot Vision
In this paper, we develop an algorithm for navigating a mobile robot using a visual potential. The visual potential is computed from an image sequence and from the optical flow derived from successive images captured by a camera mounted on the robot; that is, the potential used for navigation is derived from the appearance of the workspace as observed in the image sequence. The direction to the destination is provided only at the robot's initial position. The robot then dynamically selects a local pathway to the destination, avoiding collisions with obstacles and requiring no prior knowledge of its workspace. Furthermore, the guidance algorithm allows the mobile robot to return from the destination to its initial position. We present experimental results for navigation and homing in both synthetic and real environments.
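The idea of steering from a potential built over the image can be illustrated with a minimal sketch. The paper's exact formulation is not given in the abstract, so the following is only an assumed toy model: the optical-flow magnitude serves as a repulsive term (nearby obstacles induce large flow), a simple ramp toward the goal direction serves as an attractive term, and the robot steers toward the image column of lowest combined potential. The function names (`visual_potential`, `steer`) and the gain parameters `k_rep` and `k_att` are hypothetical, not from the paper.

```python
import numpy as np

def visual_potential(flow_mag, goal_dir, k_rep=1.0, k_att=1.0):
    """Toy visual potential over the image plane (an assumption, not the
    authors' formulation): repulsion proportional to optical-flow magnitude
    plus attraction toward the goal direction, goal_dir in [-1, 1]."""
    h, w = flow_mag.shape
    # Repulsive term: large flow magnitude suggests a nearby obstacle.
    u_rep = k_rep * flow_mag
    # Attractive term: linear ramp pulling toward the goal column.
    cols = np.arange(w)
    goal_col = int((goal_dir + 1.0) / 2.0 * (w - 1))
    u_att = k_att * np.abs(cols - goal_col) / w
    return u_rep + u_att[np.newaxis, :]

def steer(potential):
    """Pick the image column of minimal mean potential as the local
    heading, returned in [-1, 1] (left to right)."""
    col_cost = potential.mean(axis=0)
    best = int(np.argmin(col_cost))
    w = potential.shape[1]
    return 2.0 * best / (w - 1) - 1.0
```

With no flow (no obstacles) the robot heads straight for the goal; when the flow field indicates an obstacle ahead, the minimum of the potential shifts sideways and the robot detours around it, which is the qualitative behavior the abstract describes.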