A New Baseline for Image Annotation

  • Authors:
  • Ameesh Makadia; Vladimir Pavlovic; Sanjiv Kumar

  • Affiliations:
  • Google Research, New York, NY; Rutgers University, Piscataway, NJ; Google Research, New York, NY

  • Venue:
  • ECCV '08 Proceedings of the 10th European Conference on Computer Vision: Part III
  • Year:
  • 2008

Abstract

Automatically assigning keywords to images is of great interest as it allows one to index, retrieve, and understand large collections of image data. Many techniques have been proposed for image annotation in the last decade that give reasonable performance on standard datasets. However, most of these works fail to compare their methods with simple baseline techniques to justify the need for complex models and subsequent training. In this work, we introduce a new baseline technique for image annotation that treats annotation as a retrieval problem. The proposed technique utilizes low-level image features and a simple combination of basic distances to find nearest neighbors of a given image. The keywords are then assigned using a greedy label transfer mechanism. The proposed baseline outperforms the current state-of-the-art methods on two standard and one large Web dataset. We believe that such a baseline measure will provide a strong platform to compare and better understand future annotation techniques.
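The abstract already outlines the whole pipeline: compute several basic distances between a query image's low-level features and those of the training images, combine them into a single distance, retrieve the nearest neighbors, and greedily transfer their keywords to the query. The snippet below is a minimal sketch of that idea, not the paper's exact algorithm; the equal-weight averaging of L2 distances, the rescaling step, the parameters k and n_keywords, and the neighbor-by-neighbor transfer order are all assumptions made for illustration.

```python
import numpy as np

def combined_distance(query_feats, db_feats):
    """Average several basic per-feature distances into one score.

    query_feats: dict mapping feature name -> 1-D query descriptor.
    db_feats:    dict mapping feature name -> (N, D) matrix of training descriptors.
    Each basic distance here is L2, rescaled to [0, 1], and averaged with
    equal weight; this particular combination is an assumption of the sketch.
    """
    total = None
    for name, q in query_feats.items():
        d = np.linalg.norm(db_feats[name] - q, axis=1)   # basic distance for this feature
        d = d / (d.max() + 1e-12)                        # put features on a common scale
        total = d if total is None else total + d
    return total / len(query_feats)

def annotate(query_feats, db_feats, db_labels, k=5, n_keywords=5):
    """Retrieve the k nearest training images and greedily transfer their keywords."""
    dist = combined_distance(query_feats, db_feats)
    neighbors = np.argsort(dist)[:k]                     # closest training images first
    keywords = []
    for idx in neighbors:
        for label in db_labels[idx]:
            if label not in keywords:
                keywords.append(label)
            if len(keywords) == n_keywords:
                return keywords
    return keywords
```

The appeal of treating annotation as retrieval is visible in the sketch: there is no model training beyond feature extraction, so all of the effort sits in choosing the features and the distance combination.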