Near-Duplicate Video Detection Using Temporal Patterns of Semantic Concepts

  • Authors:
  • Hyun-seok Min; JaeYoung Choi; Wesley De Neve; Yong Man Ro

  • Venue:
  • ISM '09 Proceedings of the 2009 11th IEEE International Symposium on Multimedia
  • Year:
  • 2009

Abstract

Methods for video copy detection are typically based on low-level visual features. However, low-level features may vary significantly for near-duplicates, which are video sequences that have been subjected to spatial or temporal modifications. As such, low-level visual features may be inadequate for detecting near-duplicates. In this paper, we present a new video copy detection method that aims to identify near-duplicates for a given query video sequence. More specifically, the proposed method identifies semantic concepts along the temporal axis of a video sequence, resulting in the construction of a so-called semantic video signature. The semantic video signature is then used for similarity measurement. The main advantage of the proposed method is that the presence of semantic concepts is highly robust to spatial and temporal video transformations. Our experimental results show that the use of a semantic video signature allows for the efficient and effective detection of near-duplicates.
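
The abstract does not spell out how the semantic video signature is built or compared, but the general idea can be illustrated with a minimal sketch. The sketch below assumes each video is already summarized as a matrix of per-shot concept-detector scores (a hypothetical `concept_scores` input), binarizes it into a temporal sequence of concept-presence vectors, and scores two signatures with a sliding-window per-shot Jaccard overlap. The threshold, the choice of concept detectors, and the similarity measure are illustrative assumptions, not necessarily those used in the paper.

```python
import numpy as np

def semantic_signature(concept_scores, threshold=0.5):
    """Binarize per-shot concept-detector scores (shots x concepts)
    into a semantic video signature: one concept-presence vector
    per shot, in temporal order. The 0.5 threshold is an assumption.
    """
    return (np.asarray(concept_scores, dtype=float) >= threshold).astype(np.uint8)

def signature_similarity(sig_a, sig_b):
    """Slide the shorter signature over the longer one and return the
    best mean per-shot Jaccard overlap of the concept sets. This is
    one plausible temporal similarity, not necessarily the paper's.
    """
    if len(sig_a) > len(sig_b):
        sig_a, sig_b = sig_b, sig_a
    span, best = len(sig_a), 0.0
    for offset in range(len(sig_b) - span + 1):
        window = sig_b[offset:offset + span]
        inter = np.logical_and(sig_a, window).sum(axis=1)
        union = np.logical_or(sig_a, window).sum(axis=1)
        # Shots with no detected concepts in either video count as a match.
        jaccard = np.where(union > 0, inter / np.maximum(union, 1), 1.0)
        best = max(best, float(jaccard.mean()))
    return best

# Toy example: a query clip and a spatially modified near-duplicate
# should still yield nearly identical concept-presence patterns.
query = semantic_signature([[0.9, 0.1, 0.8],
                            [0.7, 0.2, 0.9],
                            [0.1, 0.8, 0.6]])
candidate = semantic_signature([[0.8, 0.2, 0.7],
                                [0.6, 0.3, 0.9],
                                [0.2, 0.9, 0.5],
                                [0.1, 0.1, 0.1]])
print(signature_similarity(query, candidate))  # close to 1.0 for a near-duplicate
```

Because the signature records only which concepts are present per shot, moderate spatial edits (logo overlays, re-encoding, letterboxing) leave it largely unchanged, which is the robustness property the abstract highlights.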