Annotating images by harnessing worldwide user-tagged photos

  • Authors:
  • Xirong Li; Cees G. M. Snoek; Marcel Worring

  • Affiliations:
  • ISLA, Informatics Institute, University of Amsterdam, Science Park 107, 1098 XG Amsterdam, The Netherlands

  • Venue:
  • ICASSP '09 Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • Year:
  • 2009

Abstract

Automatic image tagging is important yet challenging due to the semantic gap and the lack of learning examples for modeling a tag's visual diversity. Meanwhile, social user tagging is creating rich multimedia content on the web. In this paper, we propose to combine the two tagging approaches in a search-based framework. For an unlabeled image, we first retrieve its visual neighbors from a large user-tagged image database. We then select relevant tags from the result images to annotate the unlabeled image. To tackle the unreliability and sparsity of user tagging, we introduce a joint-modality tag relevance estimation method which efficiently exploits both textual and visual clues. Experiments on 1.5 million Flickr photos and 10,000 Corel images validate the proposed method.
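The search-based pipeline the abstract describes (retrieve visual neighbors, then score candidate tags against them) can be sketched as neighbor voting with a frequency prior subtracted, so that common tags do not dominate. This is an illustrative sketch, not the paper's exact formulation: the function names, the Euclidean distance choice, and the vote-minus-prior scoring are assumptions.

```python
import numpy as np

def tag_relevance(query_feat, db_feats, db_tags, vocabulary, k=3):
    """Rank candidate tags for an unlabeled image by neighbor voting.

    A tag is deemed relevant if it occurs among the query's k visual
    neighbors more often than its overall frequency would predict.
    (Hypothetical sketch of the search-based idea; not the paper's
    exact estimator.)
    """
    # Visual search step: Euclidean distance from the query to every
    # tagged image in the database, then take the k nearest.
    dists = np.linalg.norm(db_feats - query_feat, axis=1)
    knn = np.argsort(dists)[:k]

    # Vote count: how many of the k neighbors carry each tag.
    votes = {t: sum(t in db_tags[i] for i in knn) for t in vocabulary}

    # Prior: expected number of neighbors carrying the tag by chance,
    # i.e. k times the tag's overall frequency in the database.
    n = len(db_tags)
    prior = {t: k * sum(t in tags for tags in db_tags) / n
             for t in vocabulary}

    # Relevance = votes minus prior; frequent tags must be voted for
    # more often than chance to rank highly.
    return sorted(vocabulary, key=lambda t: votes[t] - prior[t],
                  reverse=True)

# Toy usage: two "beach" images near the query, two "city" images far away.
feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
tags = [{"beach"}, {"beach", "sky"}, {"city"}, {"city", "sky"}]
ranked = tag_relevance(np.array([0.05, 0.0]), feats, tags,
                       ["beach", "city", "sky"], k=2)
print(ranked)  # "beach" ranks first, "city" last
```

Subtracting the prior is what keeps overly frequent user tags (e.g. camera brands) from being suggested for every image, which is one way to address the unreliability of social tagging mentioned above.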