On the Annotation of Web Videos by Efficient Near-Duplicate Search

  • Authors:
  • Wan-Lei Zhao; Xiao Wu; Chong-Wah Ngo

  • Affiliations:
  • Dept. of Comput. Sci., City Univ. of Hong Kong, Kowloon, China

  • Venue:
  • IEEE Transactions on Multimedia
  • Year:
  • 2010

Abstract

With the proliferation of Web 2.0 applications, user-supplied social tags are commonly available in social media as a means to bridge the semantic gap. At the same time, the explosive growth of the social web has made an overwhelming number of web videos available, among which near-duplicate videos abound. In this paper, we investigate techniques for effective annotation of web videos from a data-driven perspective. We propose a novel classifier-free video annotation framework that first retrieves visual duplicates of a query video and then suggests representative tags drawn from them. The significance of this paper lies in addressing two timely issues in annotating query videos. First, we provide a novel solution for fast near-duplicate video retrieval. Second, based on the outcome of near-duplicate search, we explore the potential of data-driven annotation when a huge volume of tagged web videos is freely accessible online. Experiments across sources (annotating Google and Yahoo! videos using YouTube videos) and across time periods (annotating YouTube videos using historical data) demonstrate the effectiveness and efficiency of the proposed classifier-free approach to web video tag annotation.
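The retrieve-then-aggregate idea behind the framework can be illustrated with a minimal Python sketch. This is not the paper's implementation: the cosine-similarity matching, the 0.9 threshold, and the helper names (near_duplicates, suggest_tags) are illustrative assumptions standing in for the paper's fast near-duplicate retrieval and its tag-suggestion step.

    from collections import Counter
    import numpy as np

    def near_duplicates(query_sig, corpus_sigs, threshold=0.9):
        # Rank corpus videos by cosine similarity of their feature
        # signatures to the query; keep those above the threshold.
        # (Illustrative only: the paper uses a faster retrieval scheme.)
        q = query_sig / np.linalg.norm(query_sig)
        hits = []
        for i, sig in enumerate(corpus_sigs):
            if float(np.dot(q, sig / np.linalg.norm(sig))) >= threshold:
                hits.append(i)
        return hits

    def suggest_tags(query_sig, corpus_sigs, corpus_tags, top_k=5):
        # Classifier-free annotation: pool the social tags attached to
        # the retrieved near-duplicates and return the most frequent.
        votes = Counter()
        for i in near_duplicates(query_sig, corpus_sigs):
            votes.update(corpus_tags[i])
        return [tag for tag, _ in votes.most_common(top_k)]

    # Toy usage: the query matches the first corpus video, so it
    # inherits that video's tags.
    corpus_sigs = [np.array([0.9, 0.1]), np.array([0.1, 0.9])]
    corpus_tags = [["soccer", "goal"], ["cooking"]]
    print(suggest_tags(np.array([0.85, 0.15]), corpus_sigs, corpus_tags))
    # -> ['soccer', 'goal']

Since no classifier is trained, the retrieval step dominates the cost of annotating each query video, which is why the paper's contribution of fast near-duplicate search is what makes this data-driven scheme practical at web scale.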