Context-Based Image Similarity Queries

  • Author: Ilaria Bartolini
  • Affiliation: DEIS – IEIIT-BO/CNR, University of Bologna, Italy
  • Venue: AMR'05 Proceedings of the Third International Conference on Adaptive Multimedia Retrieval: User, Context, and Feedback
  • Year: 2005

Abstract

In this paper, an effective context-based approach for interactive similarity queries is presented. By exploiting the notion of image “context”, it is possible to associate different meanings with the same query image. This is necessary to model complex query concepts that, by their nature, cannot be effectively represented without contextualizing the target image. The context model is simple yet effective and consists of a set of significant images (possibly not relevant to the query themselves) that describe the semantic meaning the user is interested in. When feedback is present, the query context assumes a dynamic nature, changing over time according to the retrieved images that the user judges relevant to her current search task. Moreover, the proposed approach complements the role of relevance feedback by persistently maintaining over time the query parameters determined through user interaction, while ensuring search efficiency. Experimental results on a database of about 10,000 images demonstrate the quality improvement contributed by the proposed approach.
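As a rough illustration only (not the paper's actual model), the sketch below shows one way a query "context" could be used: a candidate image is scored by blending its distance to the query image with its average distance to a set of context images, and relevance feedback simply grows that set. The feature vectors, the `alpha` weight, and the linear blending are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of a context-based similarity score; the blending rule
# and parameters are assumptions, not the method described in the paper.
import numpy as np

def contextual_distance(candidate, query, context, alpha=0.5):
    """Blend distance to the query image with mean distance to the context set.

    candidate, query: 1-D feature vectors; context: list of 1-D feature vectors;
    alpha: weight of the query term (an assumed parameter).
    """
    d_query = np.linalg.norm(candidate - query)
    if not context:
        return d_query
    d_context = np.mean([np.linalg.norm(candidate - c) for c in context])
    return alpha * d_query + (1 - alpha) * d_context

def update_context(context, relevant_images):
    """Grow the context with feature vectors of images the user judged relevant."""
    return context + list(relevant_images)

# Toy usage with random 8-dimensional features standing in for image descriptors.
rng = np.random.default_rng(0)
query = rng.random(8)
context = [rng.random(8) for _ in range(3)]
database = [rng.random(8) for _ in range(100)]
ranked = sorted(database, key=lambda x: contextual_distance(x, query, context))
```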