Utilizing Related Samples to Enhance Interactive Concept-Based Video Search

  • Authors:
  • Jin Yuan; Zheng-Jun Zha; Yan-Tao Zheng; Meng Wang; Xiangdong Zhou; Tat-Seng Chua

  • Affiliations:
  • School of Computing, National University of Singapore, Singapore; School of Computing, National University of Singapore, Singapore; Institute for Infocomm Research (I²R), Singapore; School of Computing, National University of Singapore, Singapore; Computing Department, Fudan University, China; School of Computing, National University of Singapore, Singapore

  • Venue:
  • IEEE Transactions on Multimedia
  • Year:
  • 2011

Abstract

One of the main challenges in interactive concept-based video search is the shortage of relevant samples, especially for queries with complex semantics. In this paper, “related samples” are exploited to enhance interactive video search. Related samples refer to video segments that are relevant to part of the query rather than to the entire query. Compared to relevant samples, which may be rare, related samples are usually plentiful and easy to find in the search results. Generally, related samples are visually similar and temporally adjacent to relevant samples. Based on these two characteristics, we develop a visual ranking model that simultaneously exploits the relevant, related, and irrelevant samples, as well as a temporal ranking model that leverages the temporal relationship between related and relevant samples. An adaptive fusion method is then proposed to optimally combine these two ranking models to generate the search results. We conduct extensive experiments on two real-world video datasets: TRECVID 2008 and YouTube. The experimental results show that our approach achieves performance improvements of at least 96% and 167% over state-of-the-art approaches on the TRECVID 2008 and YouTube datasets, respectively.
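
The abstract describes an adaptive fusion of the visual and temporal ranking models but does not give the formula. The sketch below is only one plausible illustration of such a fusion: a weighted linear combination of the two per-shot score vectors, with the weight chosen by how well the fused ranking recovers the shots the user has already labeled relevant. All function names, the linear form, and the grid search are assumptions for illustration, not the authors' method.

```python
import numpy as np

def fuse(visual_scores, temporal_scores, alpha):
    """Linear late fusion of two per-shot score vectors (illustrative only)."""
    return alpha * visual_scores + (1.0 - alpha) * temporal_scores

def average_precision(scores, relevant_idx):
    """Average precision of a score vector against a set of relevant shot indices."""
    order = np.argsort(-scores)
    hits, ap = 0, 0.0
    for rank, idx in enumerate(order, start=1):
        if idx in relevant_idx:
            hits += 1
            ap += hits / rank
    return ap / max(len(relevant_idx), 1)

def adaptive_fusion(visual_scores, temporal_scores, relevant_idx,
                    alphas=np.linspace(0.0, 1.0, 11)):
    """Pick the fusion weight that best ranks the user-labeled relevant shots
    (a simple grid search standing in for the paper's adaptive scheme)."""
    best_alpha = max(alphas, key=lambda a: average_precision(
        fuse(visual_scores, temporal_scores, a), relevant_idx))
    return fuse(visual_scores, temporal_scores, best_alpha), best_alpha

# Example: 6 candidate shots, shots 0 and 3 labeled relevant by the user.
visual = np.array([0.9, 0.2, 0.4, 0.7, 0.1, 0.3])
temporal = np.array([0.5, 0.1, 0.2, 0.9, 0.3, 0.2])
fused, alpha = adaptive_fusion(visual, temporal, relevant_idx={0, 3})
print("chosen weight:", alpha, "ranking:", np.argsort(-fused))
```

In this toy setup, the grid search simply favors whichever weighting places the labeled relevant shots nearer the top; the paper's actual fusion may use a different objective or a closed-form weight.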