2D visual servoing uses the data provided by a vision sensor to control the motion of a dynamic system. Most visual servoing approaches rely on geometric features that must be tracked and matched in the image acquired by the camera. Recent work has highlighted the benefit of exploiting instead the photometric information of the entire image. So far, this approach has been developed for perspective cameras only. In this paper, we propose to extend the technique to central cameras, a generalization that makes it applicable to catadioptric cameras and cameras with a wide field of view. Several experiments were carried out successfully: a fisheye camera was used to control a six-degrees-of-freedom robot, and a catadioptric camera was used for a mobile robot navigation task.
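As a rough illustration of the photometric approach described above, the sketch below shows one iteration of the classical proportional control law applied to whole-image intensities: the feature vector is the stacked image, the error is the difference with the desired image, and the velocity screw is obtained through the pseudo-inverse of an interaction matrix. This is a minimal sketch under simplifying assumptions (the interaction matrix `L` is taken as given, whereas in practice it is built from image gradients and depth), not the authors' implementation.

```python
# Minimal sketch of one photometric visual-servoing control step (illustrative,
# not the paper's implementation). The visual feature is the whole image
# stacked as a vector; the control law is v = -lambda * pinv(L) @ (I - I*).
import numpy as np

def photometric_control(I_cur, I_des, L, gain=0.5):
    """One iteration of the proportional control law.

    I_cur, I_des : current and desired images (H x W arrays)
    L            : interaction matrix of the intensity features (H*W x 6),
                   assumed precomputed (hypothetical input for this sketch)
    gain         : proportional gain lambda
    Returns the 6-DOF camera velocity screw (vx, vy, vz, wx, wy, wz).
    """
    e = (I_cur - I_des).ravel()          # photometric error vector
    v = -gain * np.linalg.pinv(L) @ e    # least-squares velocity command
    return v

# Toy usage: when current and desired images coincide, the command is zero.
rng = np.random.default_rng(0)
I = rng.random((8, 8))
L = rng.random((64, 6))
print(np.allclose(photometric_control(I, I, L), 0.0))  # True
```

In a servoing loop, this step would be repeated with freshly acquired images until the photometric error, and hence the commanded velocity, vanishes.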