Learning user's intent using user tags: intelligent interactive image search system

  • Authors: Viktors Garkavijs
  • Affiliation: National Institute of Informatics, Tokyo, Japan
  • Venue: Proceedings of the sixth international workshop on Exploiting semantic annotations in information retrieval
  • Year: 2013


Abstract

The contents of two image files can be completely different in terms of low-level descriptors such as color histograms, textures, and edge histograms, yet carry the same semantic metadata, such as user tags. Exploiting user tags in image search tasks lowers computational cost, since far fewer features are needed to compute the similarity between objects, and it helps to overcome the semantic gap (this notion of similarity is, of course, different from the one computed by content-based methods). Thanks to these two properties, it becomes easy to build an interactive image search system based on online learning from the interaction between the user and the system. We propose a simple algorithm for system training that uses dwell-time data as input for relevance recalculation, and we implemented it in a prototype of our experimental search system.
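
The abstract does not reproduce the training algorithm itself. As an illustration only, the following minimal sketch assumes Jaccard similarity over tag sets and a linear dwell-time weighting; the function names, the parameters alpha and max_dwell, and the update rule are hypothetical and not claimed by the paper.

```python
# Minimal sketch of dwell-time-driven relevance recalculation over user tags.
# Assumptions (not from the abstract): Jaccard similarity between tag sets and
# a linear combination of the query score with dwell-time feedback.

def jaccard(tags_a: set[str], tags_b: set[str]) -> float:
    """Tag-set similarity; far cheaper than content-based descriptors."""
    if not tags_a or not tags_b:
        return 0.0
    return len(tags_a & tags_b) / len(tags_a | tags_b)

def recalculate_relevance(images, query_tags, viewed, alpha=0.5, max_dwell=30.0):
    """Rescore `images` ({id: tag_set}) after observing `viewed`
    ({id: dwell_seconds}). Longer dwell on an image boosts images with
    similar tags; `alpha` and `max_dwell` are illustrative parameters."""
    scores = {}
    for img_id, tags in images.items():
        base = jaccard(tags, query_tags)  # static match against the query
        feedback = 0.0
        for viewed_id, dwell in viewed.items():
            weight = min(dwell, max_dwell) / max_dwell  # normalize dwell time
            feedback += weight * jaccard(tags, images[viewed_id])
        if viewed:
            feedback /= len(viewed)
        scores[img_id] = (1 - alpha) * base + alpha * feedback
    return sorted(scores, key=scores.get, reverse=True)

# Example: the user lingered 12 s on an image tagged {"beach", "sunset", "sea"},
# so tag-similar images rise in the ranking.
images = {
    1: {"beach", "sunset", "sea"},
    2: {"mountain", "snow"},
    3: {"beach", "palm"},
}
print(recalculate_relevance(images, {"beach"}, viewed={1: 12.0}))  # [1, 3, 2]
```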