In this paper, we address the challenging problem of recovering the defocus map from a single image. We present a simple yet effective approach to estimate the amount of spatially varying defocus blur at edge locations. The input defocused image is re-blurred using a Gaussian kernel and the defocus blur amount can be obtained from the ratio between the gradients of input and re-blurred images. By propagating the blur amount at edge locations to the entire image, a full defocus map can be obtained. Experimental results on synthetic and real images demonstrate the effectiveness of our method in providing a reliable estimation of the defocus map.
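The gradient-ratio idea can be sketched in a few lines on a synthetic 1-D step edge. This is a minimal illustration, not the paper's implementation: it assumes SciPy's `gaussian_filter1d` for the re-blurring step, and the helper name `estimate_sigma` and the fixed re-blur width `sigma0` are our own labels. For a step edge blurred by a Gaussian of unknown std `sigma`, re-blurring with std `sigma0` scales the peak gradient by `sqrt(sigma**2 + sigma0**2) / sigma`, so the ratio of gradient magnitudes at the edge location recovers `sigma`:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def estimate_sigma(edge, sigma0=1.0):
    """Estimate the defocus blur std at a 1-D step edge from the
    ratio of gradient magnitudes before and after a re-blur."""
    reblurred = gaussian_filter1d(edge, sigma0)
    g1 = np.gradient(edge)
    g2 = np.gradient(reblurred)
    i = np.argmax(np.abs(g1))           # edge location = gradient maximum
    R = np.abs(g1[i]) / np.abs(g2[i])   # ratio > 1 since re-blurring widens the edge
    return sigma0 / np.sqrt(R**2 - 1)   # invert R = sqrt(sigma^2 + sigma0^2) / sigma

# Synthetic defocused edge with a known blur amount.
x = np.arange(200)
step = (x > 100).astype(float)
sigma_true = 2.0
edge = gaussian_filter1d(step, sigma_true)
print(estimate_sigma(edge))  # ≈ 2.0, the known blur std
```

In the full method this estimate is computed only at detected edge pixels and then propagated to the rest of the image to form the dense defocus map.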