Much of the previous work on hand-eye coordination has emphasized the reconstructive aspects of vision. Recently, techniques have been developed that avoid explicit reconstruction by placing visual feedback directly into a control loop. When properly defined, these methods lead to calibration-insensitive hand-eye coordination. Recent results from projective geometry, as applied to vision, are used to extend this paradigm in two ways. First, it is shown how results from projective geometry can be used to perform online calibration. Second, results on projective invariance are used to define setpoints for visual control that are independent of viewing location. These ideas are illustrated through a number of examples and have been tested on an implemented system.
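The idea of placing visual feedback directly into a control loop can be illustrated with a minimal image-based visual servoing sketch. This is not the paper's exact formulation: it assumes a single point feature, pure camera translation, a known depth `Z`, and a gain `lam` chosen for the simulation. The control law drives the image-plane error e = s - s* to zero with v = -lam L^{-1} e, where the interaction matrix for this simplified case is L = -(1/Z) I.

```python
# Illustrative sketch (assumed, simplified setup): one point feature,
# pure camera translation, known depth Z, proportional gain lam.
def servo_step(s, s_star, Z=1.0, lam=0.5):
    # image-space error e = s - s*
    ex, ey = s[0] - s_star[0], s[1] - s_star[1]
    # control law v = -lam * L^{-1} e, with L = -(1/Z) I, so v = lam * Z * e
    vx, vy = lam * Z * ex, lam * Z * ey
    # simulate one unit time step of feature motion: s_dot = L v = -v / Z
    return (s[0] - vx / Z, s[1] - vy / Z)

s, s_star = (0.8, -0.3), (0.0, 0.0)   # current feature position and setpoint
for _ in range(40):
    s = servo_step(s, s_star)
print(max(abs(s[0]), abs(s[1])) < 1e-9)  # error decays geometrically toward zero
```

Note that the control law never reconstructs the 3-D scene; the error is defined and regulated entirely in the image, which is what makes such schemes tolerant of calibration error.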
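The claim that projective invariants can define setpoints independent of viewing location rests on quantities that every perspective view agrees on. The classic example, shown in this sketch (the specific points and map coefficients are arbitrary choices for illustration), is the cross ratio of four collinear points, which is unchanged by any projective transformation:

```python
# The cross ratio of four collinear points (here as 1-D coordinates
# along their common line) is invariant under projective maps.
def cross_ratio(a, b, c, d):
    return ((a - c) * (b - d)) / ((b - c) * (a - d))

def projective_map(x, m=(2.0, 1.0, 0.5, 3.0)):
    # 1-D homography x -> (p*x + q) / (r*x + s); coefficients are arbitrary
    p, q, r, s = m
    return (p * x + q) / (r * x + s)

pts = [0.0, 1.0, 2.0, 4.0]
before = cross_ratio(*pts)
after = cross_ratio(*(projective_map(x) for x in pts))
print(abs(before - after) < 1e-9)  # the cross ratio is preserved
```

Because any camera view of the line induces such a projective map, a setpoint expressed through cross ratios means the same thing regardless of where the camera is placed.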