We present an image enhancement algorithm based on fusing the visual information in two images of the same scene, captured with different exposure times. The main idea is to exploit the differences between the degradations that affect the two images: the short-exposed image is less affected by motion blur, whereas the long-exposed image is less affected by noise. Separate fusion rules are designed for the luminance and chrominance components so as to preserve the desirable properties of each input image. We also present a method for estimating the brightness transfer function between the input images, which photometrically calibrates the short-exposed image with respect to the long-exposed one. Because no global blur PSF is assumed, our method can handle blur from both camera and object motion. We demonstrate the algorithm through a series of experiments and simulations.
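The photometric-calibration step mentioned in the abstract can be sketched as histogram matching: map the intensities of the short-exposed luminance so that its cumulative histogram matches that of the long-exposed one, which yields an estimate of the brightness transfer function between the two exposures. This is a minimal, numpy-only illustration of that general idea, not the authors' actual method; the function names, bin count, and the assumption of luminance values in [0, 1] are all ours.

```python
import numpy as np

def estimate_btf(short_lum, long_lum, bins=256):
    """Estimate a brightness transfer function (BTF) from the short-exposed
    to the long-exposed luminance via histogram matching.

    Assumes both inputs are arrays of luminance values in [0, 1].
    Returns bin centers and the matched output level for each center.
    """
    s_hist, edges = np.histogram(short_lum, bins=bins, range=(0.0, 1.0))
    l_hist, _ = np.histogram(long_lum, bins=bins, range=(0.0, 1.0))
    # Normalized cumulative histograms (empirical CDFs).
    s_cdf = np.cumsum(s_hist) / s_hist.sum()
    l_cdf = np.cumsum(l_hist) / l_hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    # For each short-image level, find the long-image level whose CDF
    # value is the same; this is the estimated BTF at that level.
    btf = np.interp(s_cdf, l_cdf, centers)
    return centers, btf

def apply_btf(short_lum, centers, btf):
    """Photometrically calibrate the short-exposed luminance by looking up
    each pixel through the estimated (monotone) transfer function."""
    return np.interp(short_lum, centers, btf)
```

Because the CDFs are nondecreasing, the estimated BTF is monotone, which matches the usual assumption that a longer exposure brightens the scene in an order-preserving way; after calibration, the short-exposed image can be compared and fused with the long-exposed one pixel by pixel.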