Despite significant progress in high dynamic range (HDR) imaging over the years, it is still difficult to capture high-quality HDR video with a conventional, off-the-shelf camera. The most practical way to do this is to capture successive LDR frames at alternating exposures and then register them with an optical-flow-based alignment method. However, this produces objectionable artifacts whenever the motion is complex and optical flow fails. To address this problem, we propose a new approach for HDR reconstruction from alternating-exposure video sequences that combines the advantages of optical flow and recently introduced patch-based synthesis for HDR images. We use patch-based synthesis to enforce similarity between adjacent frames, increasing temporal continuity. To synthesize visually plausible solutions, we combine constraints from motion estimation with a search-window map that guides the patch-based synthesis. The result is a novel reconstruction algorithm that can produce high-quality HDR videos with a standard camera. Furthermore, our method is able to synthesize plausible texture and motion in fast-moving regions, where either patch-based synthesis or optical flow alone would exhibit artifacts. We present reconstructed HDR video sequences that are superior to those produced by current approaches.
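The core of any alternating-exposure pipeline is the merge step: once the LDR frames are aligned to a common reference, each pixel's radiance is estimated as an exposure-weighted average. The sketch below illustrates only this classic merge (Debevec-and-Malik-style weighting), not the paper's full patch-based reconstruction; it assumes the frames are already registered and linearized (values in [0, 1]), and the `hat_weight` and `merge_hdr` names are illustrative, not from the paper.

```python
import numpy as np

def hat_weight(z):
    # Triangle ("hat") weight: trust mid-tone pixels most,
    # downweight under- and over-exposed ones toward zero.
    return 1.0 - np.abs(2.0 * z - 1.0)

def merge_hdr(ldr_frames, exposure_times):
    """Merge aligned, linearized LDR frames into an HDR radiance map.

    ldr_frames: list of float arrays in [0, 1], all the same shape,
                already registered to a common reference frame.
    exposure_times: matching list of exposure times (seconds).
    """
    num = np.zeros_like(ldr_frames[0], dtype=np.float64)
    den = np.zeros_like(ldr_frames[0], dtype=np.float64)
    for img, t in zip(ldr_frames, exposure_times):
        w = hat_weight(img)
        num += w * (img / t)      # per-exposure radiance estimate
        den += w
    # Guard against pixels saturated in every exposure (all weights ~0).
    return num / np.maximum(den, 1e-8)
```

For a static, correctly aligned scene the two radiance estimates agree (e.g. pixel value 0.2 at exposure 0.5 s and 0.4 at 1.0 s both imply radiance 0.4), and the weighting simply suppresses clipped pixels; the hard part the paper addresses is making the alignment itself reliable under complex motion.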