Correlation-based retrieval for heavily changed near-duplicate videos

  • Authors: Jiajun Liu, Zi Huang, Heng Tao Shen, Bin Cui
  • Affiliations: The University of Queensland, Australia (Liu, Huang, Shen); Peking University, China (Cui)
  • Venue: ACM Transactions on Information Systems (TOIS)
  • Year: 2011


Abstract

The unprecedented and ever-growing number of Web videos has led to the massive existence of near-duplicate videos. Very often, a near-duplicate video exhibits great content changes while the user perceives little information change; for example, color features change significantly when a color video is transformed with a blue filter. Such feature changes distort low-level video similarity computations, making conventional similarity-based near-duplicate video retrieval techniques incapable of accurately capturing the implicit relationship between two near-duplicate videos with fairly large content modifications. In this paper, we introduce a new dimension for near-duplicate video retrieval. Different from existing near-duplicate video retrieval approaches, which are based on video-content similarity, we explore the correlation between two videos. The intuition is that near-duplicate videos should preserve strong information correlation in spite of intensive content changes. More effective retrieval with stronger tolerance is achieved by replacing video-content similarity measures with information correlation analysis. Theoretical justification and experimental results demonstrate the effectiveness of correlation-based near-duplicate retrieval.
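The abstract's core intuition — that a heavy edit such as a color filter can make low-level features diverge while information correlation survives — can be sketched with mutual information as one concrete correlation measure. This is only an illustrative toy, not the paper's actual formulation: the synthetic per-frame features, the linear "blue filter" transform, and the histogram-based estimator below are all assumptions for demonstration.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate mutual information (in nats) between two feature sequences."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal distribution of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal distribution of y
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)

# Hypothetical per-frame color feature of a source video.
original = rng.uniform(0, 255, size=5000)
# A heavy "blue filter"-style edit: feature values change drastically,
# but the transform is deterministic, so information is preserved.
near_duplicate = 0.3 * original + 120.0
# An unrelated video whose features happen to lie in a similar range.
unrelated = rng.uniform(0, 255, size=5000)

# Content similarity (per-frame RMS feature distance) says the
# near-duplicate is far from the original ...
rms_dist = np.linalg.norm(original - near_duplicate) / np.sqrt(len(original))
# ... yet information correlation still separates near-duplicate
# from unrelated content.
mi_dup = mutual_information(original, near_duplicate)
mi_unrel = mutual_information(original, unrelated)
```

Running this, `rms_dist` is large (the features moved a lot), while `mi_dup` stays high and `mi_unrel` stays near zero — a similarity measure would reject the near-duplicate, but a correlation measure still links it to the original.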