Detection and location of near-duplicate video sub-clips by finding dense subgraphs

  • Authors:
  • Tianlong Chen; Shuqiang Jiang; Lingyang Chu; Qingming Huang

  • Affiliations:
  • Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China; Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China; Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China; Graduate University of Chinese Academy of Sciences, Beijing, China

  • Venue:
  • MM '11: Proceedings of the 19th ACM International Conference on Multimedia
  • Year:
  • 2011

Abstract

Robust and fast near-duplicate video detection is an important task with many potential applications. Most existing systems focus on comparing full copies or partial near-duplicate videos; it is more challenging to find similar content in videos that contain multiple near-duplicate segments at arbitrary locations with various temporal connections. In this paper, we propose a new graph-based method to detect complex near-duplicate video sub-clips. First, we develop a succinct video descriptor for keyframe matching. A graph is then built to exploit the temporal consistency of matched keyframes: its nodes are the matched frame pairs, and its edge weights are computed from the temporal alignment and the frame-pair similarities. In this way, correctly matched keyframes form a dense subgraph whose nodes are strongly connected, and the graph model also preserves the complex connections among sub-clips. Detecting complex near-duplicate sub-clips is thus transformed into the problem of finding all dense subgraphs, which we solve with the graph shift optimization method owing to its robust performance. Experiments conducted on a dataset with various transformations and complex temporal relations demonstrate the effectiveness and efficiency of the proposed method.
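
To make the graph construction and dense-subgraph search concrete, the sketch below builds an affinity matrix over matched frame pairs and extracts dense subgraphs with replicator dynamics, the simplex-ascent iteration that graph shift is built on (graph shift additionally interleaves a neighborhood-expansion step, omitted here). The Gaussian-of-offset-difference edge weight, the support threshold, and the minimum cluster size are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def build_pair_graph(pairs, sims, sigma=2.0):
    """Affinity matrix over matched keyframe pairs.

    pairs: list of (t_query, t_ref) timestamps of each matched pair.
    sims:  visual similarity of each pair in [0, 1].
    Two pairs are strongly connected when their temporal offsets in the
    query and reference videos agree, so a correctly aligned sub-clip
    shows up as a dense subgraph. The exp(-|dq - dr| / sigma) weighting
    is an assumed form, not necessarily the paper's exact formula.
    """
    n = len(pairs)
    A = np.zeros((n, n))
    for a in range(n):
        for b in range(a + 1, n):
            dq = pairs[b][0] - pairs[a][0]  # offset in the query video
            dr = pairs[b][1] - pairs[a][1]  # offset in the reference video
            if dq * dr <= 0:                # keep only order-preserving matches
                continue
            w = sims[a] * sims[b] * np.exp(-abs(dq - dr) / sigma)
            A[a, b] = A[b, a] = w
    return A

def dense_subgraph(A, iters=200, tol=1e-8):
    """Replicator dynamics: ascend x^T A x on the probability simplex.

    The support of the fixed point marks one dense subgraph.
    """
    n = A.shape[0]
    x = np.full(n, 1.0 / n)
    for _ in range(iters):
        Ax = A @ x
        denom = x @ Ax
        if denom < tol:                     # no edges left: nothing dense here
            break
        x_new = x * Ax / denom
        if np.abs(x_new - x).sum() < tol:   # converged
            x = x_new
            break
        x = x_new
    return np.flatnonzero(x > 0.1 / n)      # nodes with non-negligible mass

def all_dense_subgraphs(A, min_size=3):
    """Peel off dense subgraphs one by one, as the detection step requires."""
    active = np.arange(A.shape[0])
    clusters = []
    while len(active) >= min_size:
        sub = dense_subgraph(A[np.ix_(active, active)])
        if len(sub) < min_size:
            break
        clusters.append(active[sub].tolist())
        active = np.delete(active, sub)     # remove found nodes and repeat
    return clusters
```

Each returned cluster is a set of mutually consistent frame pairs; the spans of their query and reference timestamps locate one near-duplicate sub-clip. Because pairs sharing a query frame can fall into different clusters, many-to-many sub-clip relations are preserved, matching the complex connections the abstract describes.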