Near Duplicate Identification With Spatially Aligned Pyramid Matching

  • Authors:
  • Dong Xu, Tat Jen Cham, Shuicheng Yan, Lixin Duan, Shih-Fu Chang

  • Affiliations:
  • Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore

  • Venue:
  • IEEE Transactions on Circuits and Systems for Video Technology

  • Year:
  • 2010



Abstract

A new framework, termed spatially aligned pyramid matching, is proposed for near duplicate image identification. The proposed method robustly handles spatial shifts as well as scale changes, and is extensible to video data. Images are divided into both overlapped and non-overlapped blocks over multiple levels. In the first matching stage, pairwise distances between blocks from the examined image pair are computed using either the earth mover's distance (EMD) or a visual-word-based χ2 distance, both built on scale-invariant feature transform (SIFT) features. In the second stage, multiple alignment hypotheses that account for piecewise spatial shifts and scale variation are postulated and resolved using integer-flow EMD. To compute the distance between two videos, a third matching stage (temporal matching) is performed after spatial matching. Two application scenarios are addressed: near duplicate retrieval (NDR) and near duplicate detection (NDD). For retrieval ranking, a pyramid-based scheme is constructed to fuse matching results from different partition levels. For NDD, we also propose a dual-sample approach that uses the multilevel distances as features for binary classification with a support vector machine. The proposed methods are shown to clearly outperform existing methods through extensive testing on the Columbia Near Duplicate Image Database and two new datasets. In addition, we discuss the framework in depth with respect to its extension to video NDR and NDD, its sensitivity to parameters, the use of multiscale dense SIFT descriptors, and its scalability in image NDD.
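
To make the two-stage matching and pyramid fusion concrete, the following is a minimal Python sketch under stated assumptions, not the authors' implementation. It assumes each image is represented by per-block visual-word histograms, uses the χ2 block distance mentioned in the abstract, and, purely for illustration, substitutes a one-to-one Hungarian assignment (SciPy's `linear_sum_assignment`) for the integer-flow EMD alignment used in the paper. The block counts, level weights, and toy data are hypothetical.

```python
# Hedged sketch of the two-stage matching idea described in the abstract.
# The use of linear_sum_assignment in place of integer-flow EMD, the
# uniform level weights, and the toy data below are illustrative
# assumptions, not the authors' exact method.
import numpy as np
from scipy.optimize import linear_sum_assignment


def chi2(h1, h2, eps=1e-10):
    """Chi-square distance between two visual-word histograms."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))


def block_distance_matrix(blocks_a, blocks_b):
    """Stage 1: pairwise distances between the blocks of the two images."""
    return np.array([[chi2(a, b) for b in blocks_b] for a in blocks_a])


def aligned_match_cost(blocks_a, blocks_b):
    """Stage 2 (simplified): choose a one-to-one block alignment that
    minimizes total cost, tolerating piecewise spatial shifts.  The paper
    resolves alignment hypotheses with integer-flow EMD; the Hungarian
    assignment here is only an illustrative stand-in."""
    cost = block_distance_matrix(blocks_a, blocks_b)
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()


def pyramid_distance(levels_a, levels_b, weights=None):
    """Fuse per-level matching costs (coarse to fine), in the spirit of
    the pyramid-based fusion scheme used for retrieval ranking."""
    per_level = [aligned_match_cost(a, b) for a, b in zip(levels_a, levels_b)]
    if weights is None:                      # assumed uniform weights
        weights = np.ones(len(per_level)) / len(per_level)
    return float(np.dot(weights, per_level))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two pyramid levels with 4 and 16 blocks, 100-word histograms.
    levels_a = [rng.random((n, 100)) for n in (4, 16)]
    levels_b = [rng.random((n, 100)) for n in (4, 16)]
    print("pyramid distance:", pyramid_distance(levels_a, levels_b))
```

For the NDD scenario, the per-level costs returned inside `pyramid_distance` could instead be collected into a feature vector and fed to a support vector machine for binary classification, which is the role the multilevel distances play in the dual-sample approach described above.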