Real-time near-duplicate web video identification by tracking and matching of spatial features

  • Authors:
  • Kyung-Wook Park, Jee-Uk Heu, Bo-kyeong Kim, Dong-Ho Lee

  • Affiliations:
  • Hanyang University, Ansan, Kyeonggi-do, Republic of Korea (all authors)

  • Venue:
  • Proceedings of the 7th International Conference on Ubiquitous Information Management and Communication
  • Year:
  • 2013

Abstract

With the exponential growth of the Web, real-time near-duplicate Web video identification is becoming increasingly important owing to its wide spectrum of applications, including copyright detection and commercial monitoring. Although there has been significant research effort on efficiently identifying near-duplicates in large video collections, most existing methods rely on global features that are sensitive to photometric variations such as illumination direction, intensity, color, and highlights. This paper proposes a novel local-feature-based approach to address the efficiency and scalability issues of near-duplicate Web video identification. First, to represent each shot, we introduce a compact spatial signature generated from the trajectories of local patches. We then construct an efficient index structure over these spatial signatures to find the shots that correspond to a query video. Finally, we adopt a naive-Bayesian approach to identify near-duplicates from the set of corresponding shots. To demonstrate the effectiveness and efficiency of the proposed method, we evaluate its performance on an open Web video data set containing about 10K Web videos.
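The three-stage pipeline in the abstract (signature extraction, signature indexing, naive-Bayesian matching) can be sketched in miniature. This is an illustrative assumption, not the authors' implementation: here a spatial signature is reduced to an opaque hashable key, the index is a simple inverted map from signatures to video IDs, and the naive-Bayesian step is approximated by summing per-signature log-weights under an independence assumption. The class and method names (`SignatureIndex`, `add_shot`, `query`) are invented for the sketch.

```python
from collections import defaultdict
import math

class SignatureIndex:
    """Toy inverted index over per-shot spatial signatures (hypothetical
    stand-in for the paper's index structure)."""

    def __init__(self):
        self.postings = defaultdict(set)     # signature -> set of video ids
        self.shot_counts = defaultdict(int)  # video id  -> indexed shot count

    def add_shot(self, video_id, signature):
        """Index one shot of a reference video by its spatial signature."""
        self.postings[signature].add(video_id)
        self.shot_counts[video_id] += 1

    def query(self, query_signatures):
        """Rank reference videos by a naive-Bayes-style score: treat shot
        matches as conditionally independent and give rarer (more
        discriminative) signatures a larger log-weight."""
        n_videos = max(len(self.shot_counts), 1)
        scores = defaultdict(float)
        for sig in set(query_signatures):
            for vid in self.postings.get(sig, ()):
                scores[vid] += math.log(1.0 + n_videos / len(self.postings[sig]))
        # Highest score first; a threshold on the top score would decide
        # whether the query is a near-duplicate at all.
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

For example, after indexing shots of videos "A" and "B", `query(["sig1", "sig2"])` returns "A" first when those signatures were only seen in "A". The real system would additionally quantize patch trajectories into compact signatures and threshold the posterior, details the sketch omits.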