Reference-guided exposure fusion in dynamic scenes

  • Authors:
  • Wei Zhang; Wai-Kuen Cham

  • Affiliations:
  • Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, CA 94720, USA; Department of Electronic Engineering, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong

  • Venue:
  • Journal of Visual Communication and Image Representation
  • Year:
  • 2012


Abstract

Unlike high dynamic range (HDR) imaging, exposure fusion generates a tone-mapped-like HDR image directly by fusing a series of bracketed exposures. Because it frees users from the tedious radiometric calibration and tone mapping steps, the technique has become increasingly popular and is now a basic tool in many graphics software packages. The main drawback of exposure fusion is its restriction to static scenes: any object movement in the target scene incurs severe ghosting artifacts in the fused result. In this paper, we aim to overcome this limitation and make exposure fusion applicable to dynamic scenes. A new quality assessment system is developed, in which both temporal consistency and spatial consistency are introduced to account for ghosting artifacts. Experimental results on various dynamic scenes demonstrate the effectiveness of the proposed method.
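For context, the static-scene baseline this paper extends is Mertens-style exposure fusion, which blends a bracketed stack with per-pixel quality weights. The sketch below is a minimal illustration, not the authors' method: it uses a well-exposedness weight and, as a crude stand-in for the paper's reference-guided consistency idea, zeroes the blend weight of pixels that disagree strongly with a chosen reference frame. All function names, thresholds, and the global-gain compensation are illustrative assumptions.

```python
# Minimal exposure-fusion sketch (Mertens-style well-exposedness weights)
# with a crude reference-consistency mask standing in for the paper's
# temporal/spatial consistency terms. Names and thresholds are assumptions.
import numpy as np

def well_exposedness(img, sigma=0.2):
    """Per-pixel weight favouring mid-range intensities (img in [0, 1])."""
    w = np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))
    return w.prod(axis=-1)  # combine colour channels

def consistency_mask(img, ref, thresh=0.15):
    """Keep pixels whose (roughly exposure-normalised) intensity agrees
    with the reference frame; moving objects tend to violate this."""
    g_img = img.mean(axis=-1)
    g_ref = ref.mean(axis=-1)
    # A single global gain compensates for the exposure difference.
    gain = (g_ref.mean() + 1e-6) / (g_img.mean() + 1e-6)
    return (np.abs(g_img * gain - g_ref) < thresh).astype(np.float64)

def fuse(stack, ref_idx=0):
    """Weighted per-pixel average of a bracketed stack of float HxWx3
    images in [0, 1], suppressing pixels inconsistent with the reference."""
    ref = stack[ref_idx]
    weights = []
    for i, img in enumerate(stack):
        w = well_exposedness(img)
        if i != ref_idx:
            w = w * consistency_mask(img, ref)  # suppress ghost pixels
        weights.append(w + 1e-12)               # avoid divide-by-zero
    weights = np.stack(weights)
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights[..., None] * np.stack(stack)).sum(axis=0)

# Usage: stack = [im_under, im_mid, im_over], each float32 HxWx3 in [0, 1]
# fused = fuse(stack, ref_idx=1)
```

A real implementation would blend the weighted stack with a Laplacian pyramid (as in Mertens et al.) to avoid seams, and the temporal and spatial consistency measures proposed in the paper are considerably more refined than the global-gain agreement test used here.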