Utilizing related samples to learn complex queries in interactive concept-based video search

  • Authors: Jin Yuan; Zheng-Jun Zha; Zhengdong Zhao; Xiangdong Zhou; Tat-Seng Chua
  • Affiliations: National University of Singapore; National University of Singapore; National University of Singapore; Fudan University, China; National University of Singapore
  • Venue: Proceedings of the ACM International Conference on Image and Video Retrieval
  • Year: 2010

Abstract

One of the main challenges in interactive concept-based video search is the problem of insufficient relevant samples, especially for queries with complex semantics. To address this problem, we propose to utilize "related samples" to learn complex queries. Related samples are video segments that are irrelevant to the query itself but relevant to some of the query's related concepts. Unlike relevant samples, which may be rare, related samples are usually plentiful and easy to find in the search result list. Specifically, we learn a detector for the query by simultaneously leveraging the related-concept detectors and the user's feedback, which includes relevant, irrelevant, and related samples. The query detector is then employed to predict the presence of the query in new video segments, from which new search results are obtained. Furthermore, our approach is built on an incremental learning technique, so the query detector can be updated efficiently in each feedback iteration. We conduct experiments on two real-world video datasets, TRECVID 2008 and YouTube. The results demonstrate the effectiveness and efficiency of the proposed approach.
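The abstract does not specify the learning formulation, but the core idea — a query detector trained on related-concept detector scores, with relevant, irrelevant, and related samples contributing different supervision, updated incrementally each feedback round — can be sketched roughly. The sketch below is an illustrative simplification, not the authors' method: it uses a logistic-regression detector over hypothetical concept-detector scores, gives related samples a soft label between irrelevant (0) and relevant (1), and exposes a `partial_fit` method for per-iteration updates.

```python
import numpy as np

class IncrementalQueryDetector:
    """Illustrative query detector over related-concept detector scores.

    Assumptions (not from the paper): each video segment is represented by
    the scores of pre-trained related-concept detectors; relevant feedback
    samples get label 1.0, irrelevant 0.0, and "related" samples a soft
    label (e.g. 0.3), since they share concepts with the query without
    matching it. Training is plain SGD on logistic loss, so each feedback
    round is an incremental partial_fit rather than retraining from scratch.
    """

    def __init__(self, n_concepts, lr=0.5, epochs=50):
        self.w = np.zeros(n_concepts)  # one weight per related concept
        self.b = 0.0
        self.lr = lr
        self.epochs = epochs

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def partial_fit(self, X, y):
        """Update the detector with one feedback round (X: scores, y: labels)."""
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        for _ in range(self.epochs):
            p = self._sigmoid(X @ self.w + self.b)
            grad = p - y                       # gradient of logistic loss
            self.w -= self.lr * (X.T @ grad) / len(y)
            self.b -= self.lr * grad.mean()
        return self

    def predict_score(self, X):
        """Predicted probability that the query is present in each segment."""
        return self._sigmoid(np.asarray(X, dtype=float) @ self.w + self.b)

# Toy feedback round with 3 hypothetical related-concept detectors:
# two relevant segments, one irrelevant, one related (soft label 0.3).
X_feedback = [[0.90, 0.80, 0.10],
              [0.80, 0.90, 0.20],
              [0.10, 0.10, 0.10],
              [0.70, 0.10, 0.90]]
y_feedback = [1.0, 1.0, 0.0, 0.3]

detector = IncrementalQueryDetector(n_concepts=3)
detector.partial_fit(X_feedback, y_feedback)

# Re-rank two unseen segments by predicted query presence.
scores = detector.predict_score([[0.85, 0.85, 0.15],
                                 [0.10, 0.05, 0.10]])
```

In a full interactive loop, each new round of user feedback would be passed to `partial_fit` again, so the detector keeps the state learned in earlier iterations instead of being rebuilt — the efficiency property the abstract attributes to the incremental design.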