Integrated planning of robotic and computer vision based spatial reasoning tasks
IEA/AIE '90 Proceedings of the 3rd international conference on Industrial and engineering applications of artificial intelligence and expert systems - Volume 1
One of the fundamental difficulties in using computer vision in dynamic environments is that camera calibration coefficients must be adjusted as the relative distance between camera and target object changes, which forces refocusing. Such situations arise frequently in robotic environments in which the visual sensor is mobile or the target objects are in motion. This paper presents a method for computing camera calibration coefficients when the relative motion between camera and target object is known to be a translation along the optical axis, as when the camera moves directly toward or away from an object of interest. The calibration technique is straightforward, involving only the solution of linear equations. It is demonstrated that, within the context of a spatial reasoning system, inclusion of the calibration method can improve the relative accuracy of spatial inferences by one to two orders of magnitude.
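The abstract states only that the calibration reduces to solving linear equations; it does not give the formulation. As an illustrative sketch (not the paper's actual method), a simple pinhole model makes the linearity plausible: a point at lateral offset X and depth Z projects to image coordinate u = fX/Z, so if the camera translates by a known distance t along the optical axis, each observation satisfies u(Z0 + t) = fX, which is linear in the unknowns Z0 and the product fX. The function below, with all names and the model assumed for illustration, recovers those unknowns from (t, u) pairs by least squares:

```python
def calibrate_axial(observations):
    """Recover (Z0, fX) from (t_k, u_k) pairs under an assumed pinhole model.

    Each observation gives one linear equation  u_k * Z0 - fX = -u_k * t_k
    in the unknowns Z0 (initial depth) and fX (focal length times lateral
    offset).  This is an illustrative sketch, not the paper's formulation.
    """
    # Accumulate the 2x2 normal equations A^T A x = A^T b, where each row
    # of A is [u_k, -1] and the corresponding entry of b is -u_k * t_k.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for t, u in observations:
        rhs = -u * t
        a11 += u * u          # sum of u_k^2
        a12 += u * -1.0       # sum of -u_k
        a22 += 1.0            # number of observations
        b1 += u * rhs
        b2 += -1.0 * rhs
    # Solve the 2x2 system by Cramer's rule.
    det = a11 * a22 - a12 * a12
    z0 = (b1 * a22 - b2 * a12) / det
    fx = (a11 * b2 - a12 * b1) / det
    return z0, fx


# Synthetic usage: with fX = 80 and Z0 = 2.0, observations at three known
# axial translations recover both unknowns exactly (no noise added).
obs = [(t, 80.0 / (2.0 + t)) for t in (0.0, 0.5, 1.0)]
z0, fx = calibrate_axial(obs)
```

With noisy image measurements, the same accumulation yields the least-squares estimate; with exactly two observations at distinct depths, the system is determined and linear, consistent with the abstract's claim that the technique involves only the solution of linear equations.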