Improving Video Stabilization in the Presence of Motion Blur

  • Authors:
  • Manish Okade;P. K. Biswas

  • Venue:
  • NCVPRIPG '11 Proceedings of the 2011 Third National Conference on Computer Vision, Pattern Recognition, Image Processing and Graphics
  • Year:
  • 2011

Abstract

In this paper, we propose deblurring for efficient feature matching in the context of video stabilization. This is achieved by incorporating a motion deblurring block prior to the feature extraction and matching stage in the stabilization pipeline. The effect of motion blur on feature extraction and matching has received little attention so far, since most approaches treat deblurring as a post-processing step. After preprocessing the blurred frames, scale-invariant feature transform (SIFT) features are used to find correspondences and estimate the camera motion. The motion parameters are then smoothed with a Gaussian filter to retain the desired motion. Finally, the inverse of the resulting image transform is applied to obtain a stable video sequence. We compare our method to existing approaches and show how the inserted block improves video stabilization performance. Interframe Transformation Fidelity (ITF) is used to demonstrate the superiority of the proposed approach.
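For orientation, the sketch below illustrates the stabilization stages described in the abstract (SIFT matching, camera-motion estimation, Gaussian smoothing of the trajectory, inverse warping, and ITF as mean inter-frame PSNR), assuming an OpenCV/SciPy implementation. It is not the authors' code: the smoothing strength, the 0.75 ratio-test threshold, and all function names are illustrative, and the deblurring block proposed in the paper is only marked as a placeholder comment.

```python
# Illustrative sketch (not the paper's implementation) of a SIFT-based
# stabilization pipeline with Gaussian trajectory smoothing and an ITF check.
import cv2
import numpy as np
from scipy.ndimage import gaussian_filter1d

def estimate_pairwise_motion(prev_gray, curr_gray, sift, matcher):
    """Estimate translation and rotation (dx, dy, da) between two frames via SIFT."""
    kp1, des1 = sift.detectAndCompute(prev_gray, None)
    kp2, des2 = sift.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return 0.0, 0.0, 0.0
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]  # Lowe ratio test
    if len(good) < 4:
        return 0.0, 0.0, 0.0
    src = np.float32([kp1[m.queryIdx].pt for m in good])
    dst = np.float32([kp2[m.trainIdx].pt for m in good])
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    if M is None:
        return 0.0, 0.0, 0.0
    return M[0, 2], M[1, 2], np.arctan2(M[1, 0], M[0, 0])

def stabilize(frames, sigma=15.0):
    """Smooth the accumulated camera trajectory and warp frames to follow it."""
    sift, matcher = cv2.SIFT_create(), cv2.BFMatcher()
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    # (Assumed) the paper's motion-deblurring block would preprocess `grays`
    # here, before feature extraction and matching.
    motion = np.array([estimate_pairwise_motion(grays[i - 1], grays[i], sift, matcher)
                       for i in range(1, len(frames))])
    trajectory = np.cumsum(motion, axis=0)            # accumulated camera path
    smoothed = gaussian_filter1d(trajectory, sigma=sigma, axis=0)
    correction = motion + (smoothed - trajectory)     # keep intended motion, drop jitter
    h, w = frames[0].shape[:2]
    out = [frames[0]]
    for i, (dx, dy, da) in enumerate(correction, start=1):
        M = np.array([[np.cos(da), -np.sin(da), dx],
                      [np.sin(da),  np.cos(da), dy]], dtype=np.float32)
        out.append(cv2.warpAffine(frames[i], M, (w, h)))
    return out

def itf(frames):
    """Interframe Transformation Fidelity: mean PSNR between consecutive frames."""
    return np.mean([cv2.PSNR(frames[i - 1], frames[i]) for i in range(1, len(frames))])
```

A higher ITF on the stabilized sequence than on the input indicates reduced inter-frame jitter, which is how the abstract's comparison against existing approaches would be quantified.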