A collaborative Bayesian image retrieval framework

  • Authors:
  • Rui Zhang; Ling Guan

  • Affiliations:
  • Ryerson Multimedia Research Laboratory, Ryerson University, Toronto, Canada

  • Venue:
  • ICASSP '09 Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • Year:
  • 2009

Abstract

In this paper, an image retrieval framework combining content-based and content-free methods is proposed, which employs both short-term relevance feedback (STRF) and long-term relevance feedback (LTRF) as the means of user interaction. STRF refers to iterative, query-specific model learning during a retrieval session, while LTRF refers to the estimation of a user-history model from past retrieval results approved by previous users. The framework is formulated based on Bayes' theorem, in which the results of STRF and LTRF serve to refine the likelihood and provide the a priori information, respectively, and the images are ranked according to the a posteriori probability. Since the estimation of the user-history model is based on the principle of collaborative filtering, the system is referred to as a collaborative Bayesian image retrieval (CLBIR) framework. To evaluate the effectiveness of the proposed framework, nearest neighbor CLBIR (NN-CLBIR) and support vector machine active learning CLBIR (SVMAL-CLBIR) were implemented. Experimental results showed improvements over purely content-based methods in terms of both retrieval accuracy and ranking, attributable to the integration of the two feedback mechanisms in the proposed framework.
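
The ranking rule described in the abstract reduces to scoring each image by an unnormalized posterior, i.e., the product of a session-specific likelihood (from STRF) and a history-based prior (from LTRF). The sketch below illustrates only that combination step; the function name, array layout, and toy numbers are illustrative assumptions, not the authors' implementation or the specific NN/SVM active-learning models used in the paper.

```python
import numpy as np

def rank_by_posterior(likelihoods, priors):
    """Rank database images by unnormalized posterior probability.

    likelihoods: per-image p(x | relevant), e.g. estimated from short-term
                 relevance feedback (STRF) within the current session.
    priors:      per-image P(relevant), e.g. estimated from the long-term
                 user-history (LTRF / collaborative-filtering) model.
    Returns image indices sorted from most to least relevant.
    """
    posteriors = np.asarray(likelihoods) * np.asarray(priors)
    return np.argsort(-posteriors)

# Hypothetical example with three database images:
likelihoods = [0.40, 0.10, 0.30]   # from the query-specific STRF model
priors      = [0.20, 0.50, 0.30]   # from the LTRF user-history model
print(rank_by_posterior(likelihoods, priors))  # -> [2 0 1]
```

Because the ranking only compares posteriors of images against one another, the normalizing evidence term in Bayes' theorem can be dropped, which is why a simple likelihood-times-prior product suffices here.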