A robust spatial-temporal line-warping based deinterlacing method

  • Authors:
  • Shing-Fat Tu; Oscar C. Au; Yannan Wu; Enming Luo; Chi-Ho Yeung

  • Affiliations:
  • Dept. of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Kowloon, Hong Kong (all authors)

  • Venue:
  • ICME '09: Proceedings of the 2009 IEEE International Conference on Multimedia and Expo
  • Year:
  • 2009


Abstract

In this paper, a line-warping based deinterlacing method is introduced. The missing pixels in interlaced video are derived by warping the pixels of horizontal line pairs. To increase the accuracy of temporal prediction, multiple temporal line pairs, selected according to a constant-velocity motion model, are used for warping. Stationary pixels are well preserved by accurate stationary detection. A soft switching between the spatial-temporal interpolated value and the temporal average is introduced to prevent unstable switching. Owing to these novelties, the proposed method yields deinterlaced video of higher visual quality than conventional methods, and it suppresses most deinterlacing artifacts, such as line crawling, flickering, and ghost shadows.
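
The abstract does not spell out the switching rule, so the sketch below is only a minimal illustration of soft switching in this spirit: the final pixel value blends the temporal average with the spatial-temporal (line-warping) estimate through a weight derived from a per-pixel motion measure. The function name, the linear ramp, the thresholds `t_low`/`t_high`, and the choice of `|prev - next|` as the motion measure are all hypothetical assumptions, not the authors' formula.

```python
import numpy as np

def soft_switch(spatio_temporal, temporal_avg, motion_measure,
                t_low=4.0, t_high=16.0):
    """Softly blend two estimates of a missing pixel (illustrative sketch).

    spatio_temporal : spatial-temporal (line-warping) interpolation result
    temporal_avg    : average of the co-located pixels in the previous
                      and next fields
    motion_measure  : per-pixel stationarity measure, e.g. |prev - next|
    t_low, t_high   : hypothetical ramp thresholds (not from the paper)

    The weight alpha falls from 1 (pure temporal average, for stationary
    pixels) to 0 (pure spatial-temporal estimate, for moving pixels) as
    the motion measure grows, avoiding abrupt hard switching.
    """
    alpha = np.clip((t_high - motion_measure) / (t_high - t_low), 0.0, 1.0)
    return alpha * temporal_avg + (1.0 - alpha) * spatio_temporal

# Example on synthetic data for one missing line (pixel values in [0, 255]):
prev_field = np.array([100.0, 100.0, 200.0,  50.0])
next_field = np.array([100.0, 102.0, 120.0, 210.0])
warped     = np.array([101.0,  99.0, 160.0, 130.0])  # stand-in for the line-warping result

motion = np.abs(prev_field - next_field)  # 0 where the scene is stationary
out = soft_switch(warped, 0.5 * (prev_field + next_field), motion)
print(out)  # stationary pixels follow the temporal average, moving ones the warped estimate
```

A continuous weight of this kind is one common way to realize the "soft" switching the abstract describes: a hard per-pixel decision between the two estimates can toggle between frames and itself cause the flickering the method aims to suppress.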