The Reverse Projection Correlation Principle for Depth from Defocus
3DPVT '06 Proceedings of the Third International Symposium on 3D Data Processing, Visualization, and Transmission (3DPVT'06)
Depth from defocus (DFD) is a 3D recovery method based on estimating the amount of defocus induced by finite lens apertures. Given two images taken with different camera settings, the problem is to measure the resulting defocus differences across the image and to estimate depth from them. Most methods assume that the scene depth map is locally smooth, which leads to inaccurate depth estimates near discontinuities. In this paper, we propose a novel DFD method that avoids smoothing over discontinuities by iteratively modifying the elliptical image region over which defocus is estimated. Our method can complement any depth from defocus method based on spatial-domain measurements; in particular, it improves DFD accuracy near discontinuities in depth or surface orientation.
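The core measurement the abstract describes, estimating the relative defocus between two registered images of the same scene, can be sketched in a minimal brute-force form: blur the sharper image with candidate Gaussian kernels and keep the width that best matches the more defocused image. This is an illustrative toy (the function name, the synthetic texture, and the single global window are assumptions for the demo), not the paper's region-adaptive method, which would perform this comparison over an iteratively reshaped elliptical region and map the blur estimate to depth via the lens geometry.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_relative_blur(sharp, blurred, sigmas):
    """Return the Gaussian sigma that, applied to `sharp`, best matches
    `blurred` in a least-squares sense over the whole image.
    Illustrative only: spatial-domain DFD methods perform a comparison
    like this over local windows and convert sigma to depth."""
    errors = [np.mean((gaussian_filter(sharp, s) - blurred) ** 2)
              for s in sigmas]
    return sigmas[int(np.argmin(errors))]

# Synthetic example: a band-limited random texture and a defocused copy
# generated with a known blur of sigma = 2.0.
rng = np.random.default_rng(0)
texture = gaussian_filter(rng.standard_normal((128, 128)), 1.0)
defocused = gaussian_filter(texture, 2.0)

candidates = np.arange(0.5, 4.01, 0.25)
print(estimate_relative_blur(texture, defocused, candidates))  # prints 2.0
```

Because the true sigma is among the candidates and both blurs use the same boundary handling, the search recovers it exactly here; on real image pairs the minimum is only approximate, and near depth discontinuities the window straddles two blur levels, which is precisely the failure mode the proposed elliptical-region refinement targets.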