Robust video stabilization based on particle filter tracking of projected camera motion

  • Authors:
  • Junlan Yang, Dan Schonfeld, Magdi Mohamed

  • Affiliations:
  • Junlan Yang and Dan Schonfeld: Department of Electrical and Computer Engineering, University of Illinois, Chicago, IL
  • Magdi Mohamed: Physical and Digital Realization Research Center of Excellence, Motorola Labs, Schaumburg, IL

  • Venue:
  • IEEE Transactions on Circuits and Systems for Video Technology
  • Year:
  • 2009

Abstract

Video stabilization is an important technique in digital cameras, and its importance grows rapidly with the rising popularity of handheld cameras and cameras mounted on moving platforms (e.g., cars). Stabilizing two images can be viewed as an image registration problem; to ensure the visual quality of an entire video, however, video stabilization places particular emphasis on accuracy and robustness over long image sequences. In this paper, we propose a novel technique for video stabilization based on the particle filtering framework. We extend the traditional use of particle filters in object tracking to tracking of the projected affine model of the camera motion, and we rely on the inverse of the resulting image transform to obtain a stable video sequence. Correspondences between scale-invariant feature transform (SIFT) points are used to obtain a crude estimate of the projected camera motion, which we then postprocess with particle filters to obtain a smooth estimate. We show both theoretically and experimentally that particle filtering reduces the error variance compared to estimation without particle filtering. The superior performance of our algorithm over other video stabilization methods is demonstrated through computer-simulated experiments.
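The core idea of the abstract — postprocessing noisy, feature-based motion estimates with a particle filter to reduce error variance — can be illustrated with a minimal sketch. The example below is a hypothetical one-dimensional stand-in (smoothing a single translation parameter rather than the full projected affine model of the paper); the noise levels, motion model, and particle count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a smooth intentional camera motion corrupted by
# measurement noise, standing in for crude SIFT-based estimates of one
# affine parameter across T frames.
T = 100
true_motion = np.cumsum(rng.normal(0.0, 0.2, T))   # smooth camera path
crude = true_motion + rng.normal(0.0, 1.0, T)      # noisy crude estimates

N = 500                                  # number of particles (assumed)
particles = rng.normal(crude[0], 1.0, N)
weights = np.full(N, 1.0 / N)
smoothed = np.empty(T)

for t in range(T):
    # Propagate particles with a random-walk motion model (assumed dynamics).
    particles = particles + rng.normal(0.0, 0.3, N)
    # Reweight by the Gaussian likelihood of the crude measurement.
    weights = weights * np.exp(-0.5 * (crude[t] - particles) ** 2)
    weights = weights / weights.sum()
    # Posterior-mean estimate of the motion parameter at frame t.
    smoothed[t] = np.dot(weights, particles)
    # Resample to avoid weight degeneracy.
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.full(N, 1.0 / N)

# The filtered estimate should track the true motion with lower error
# variance than the raw crude estimates, mirroring the paper's claim.
err_crude = np.var(crude - true_motion)
err_pf = np.var(smoothed - true_motion)
```

In the full method, each particle would carry the six parameters of the projected affine model, and the inverse of the smoothed transform would be applied to each frame to produce the stabilized sequence.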