IEEE Transactions on Pattern Analysis and Machine Intelligence
The structure of a dynamic scene can only be recovered with a real-time range sensor. Depth from defocus offers a direct route to fast, dense range estimation: it is computationally efficient because it circumvents the correspondence problem faced by stereo and by feature tracking in structure from motion. Accurate depth estimation, however, requires theoretical and practical solutions to several problems, including the recovery of textureless surfaces, precise blur estimation, and magnification variations caused by defocusing. Both textured and textureless surfaces are recovered using an illumination pattern projected through the same optical path used to acquire the images; the pattern is optimized to maximize the accuracy and spatial resolution of the computed depth. The relative blur between two images is computed with a narrow-band linear operator designed by considering all the optical, sensing, and computational elements of the depth-from-defocus system. Defocus-invariant magnification is achieved with an additional aperture in the imaging optics. A prototype focus range sensor has been developed that produces up to 512×480 depth estimates at 30 Hz with an accuracy better than 0.3%. Several experimental results are included to demonstrate the performance of the sensor.
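The core cue described above — relative blur between two images taken at different focus settings — can be illustrated with a minimal sketch. This is not the paper's optimized narrow-band operator; it uses a crude discrete-Laplacian focus measure on a synthetic 1-D texture (standing in for the projected illumination pattern), and all names and parameters are illustrative assumptions.

```python
# Illustrative sketch (NOT the paper's exact operator): compare the
# narrow-band energy of two registered signals with different defocus
# blur, the basic cue used in depth from defocus.
import math

def gaussian_kernel(sigma, radius=None):
    # Normalized 1-D Gaussian kernel modeling the defocus blur.
    if radius is None:
        radius = max(1, int(3 * sigma))
    k = [math.exp(-(x * x) / (2.0 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    # Direct convolution with edge clamping.
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

def bandpass_energy(signal):
    # Discrete-Laplacian response energy: a crude band-pass focus measure.
    return sum((signal[i - 1] - 2.0 * signal[i] + signal[i + 1]) ** 2
               for i in range(1, len(signal) - 1))

# Synthetic textured row (analogue of the projected illumination pattern).
texture = [math.sin(0.9 * x) + 0.5 * math.sin(2.3 * x) for x in range(128)]

# Two "images" of the same scene point under different defocus blur.
near = convolve(texture, gaussian_kernel(0.8))   # close to focus
far = convolve(texture, gaussian_kernel(2.0))    # farther from focus

# The less-blurred image retains more band-pass energy; the ratio is a
# monotonic cue for depth between the two focus settings.
ratio = bandpass_energy(near) / bandpass_energy(far)
print(ratio > 1.0)  # True: the near-focus image is sharper here
```

In the actual sensor the same comparison is made per pixel with a filter tuned to the projected pattern's dominant frequency, which is what makes the blur estimate both precise and spatially dense.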