Scene duplicate detection based on the pattern of discontinuities in feature point trajectories

  • Authors:
  • Xiaomeng Wu;Masao Takimoto;Shin'ichi Satoh;Jun Adachi

  • Affiliations:
  • National Institute of Informatics, Tokyo, Japan;The University of Tokyo, Tokyo, Japan;National Institute of Informatics, Tokyo, Japan;National Institute of Informatics, Tokyo, Japan

  • Venue:
  • MM '08 Proceedings of the 16th ACM international conference on Multimedia
  • Year:
  • 2008


Abstract

This paper aims to detect and retrieve videos of the same scene (scene duplicates) from broadcast video archives. A scene duplicate consists of different pieces of footage showing the same scene and the same event at the same time, but from different viewpoints. Scene duplicate detection is particularly useful for identifying the same event reported in different programs from different broadcast stations, so the approach must be invariant to viewpoint changes. We focused on object motion in videos and devised a video matching approach based on the temporal pattern of discontinuities obtained from feature point trajectories. This discontinuity pattern is more robust to variations in camerawork and editing than conventional features, and we developed an acceleration method based on it that dramatically reduces the computational burden. We compared our approach with an existing video matching method based on local features of keyframes; the spatial registration strategy of that method was also combined with the proposed approach to cope with visually different, unrelated video pairs. The performance and effectiveness of our approach were demonstrated on actual broadcast videos.
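The core idea of the abstract can be illustrated with a minimal sketch: when a tracked feature point is lost (e.g., due to occlusion or object motion), its trajectory terminates, and the timing of such terminations forms a temporal signature that two viewpoints of the same event should share. The function names, the `(start_frame, end_frame)` trajectory representation, and the use of normalized cross-correlation for matching are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def discontinuity_signal(trajectories, num_frames):
    """Count trajectory discontinuities (terminations) per frame.

    trajectories: list of (start_frame, end_frame) spans for tracked
    feature points; a trajectory "breaks" at its end frame.
    This is a deliberate simplification of feature-point tracking.
    """
    signal = np.zeros(num_frames)
    for start, end in trajectories:
        if end < num_frames - 1:  # a break before the video ends
            signal[end] += 1
    return signal

def match_score(sig_a, sig_b):
    """Peak of the normalized cross-correlation between two
    discontinuity signals; sliding over all lags accounts for the
    unknown temporal offset between two recordings."""
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0
    corr = np.correlate(a, b, mode="full") / denom
    return float(corr.max())
```

Under this sketch, two videos of the same event would yield similar discontinuity timing even when shot from different viewpoints, so their correlation peak is high, while the signal is far cheaper to compare than frame-level visual features.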