Improving Automatic Image Annotation Based on Word Co-occurrence

  • Authors:
  • H. Jair Escalante;Manuel Montes;L. Enrique Sucar

  • Affiliations:
  • Computer Science Department, National Institute of Astrophysics, Optics and Electronics, Puebla, México 72840 (all authors)

  • Venue:
  • Adaptive Multimedia Retrieval: Retrieval, User, and Semantics
  • Year:
  • 2007

Abstract

The accuracy of current automatic image labeling methods falls below the requirements of annotation-based image retrieval systems. The performance of most of these labeling methods is poor if we consider only the most relevant label for a given region. However, if we look within the set of the top-k candidate labels for a given region, the accuracy of most of these systems improves. In this paper we take advantage of this fact and propose a method (NBI), based on word co-occurrence and the naïve Bayes formulation, for improving automatic image annotation methods. Our approach uses the co-occurrence of the candidate labels for a region with the candidate labels of the other surrounding regions, within the same image, to select the correct label. Co-occurrence information is obtained from an external collection of manually annotated images: the IAPR-TC12 benchmark. Experimental results using a k-nearest neighbors method as our annotation system give evidence of significant improvements after applying the NBI method. NBI is efficient since the co-occurrence information is obtained off-line. Furthermore, our method can be applied to any other annotation system that ranks labels by their relevance.
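
The abstract only outlines the idea; the following is a minimal Python sketch of that general scheme, not the authors' implementation: the top-k candidate labels of a region are re-scored by combining their base scores, under a naïve Bayes independence assumption, with smoothed co-occurrence probabilities against the candidate labels of the surrounding regions. The function names, the Laplace smoothing, and the toy data are illustrative assumptions.

```python
from collections import defaultdict
import math

def build_cooccurrence(annotated_images):
    """Count how often pairs of labels appear together in the same
    manually annotated image (e.g. captions from an external benchmark)."""
    pair_counts = defaultdict(int)
    label_counts = defaultdict(int)
    for labels in annotated_images:
        unique = set(labels)
        for a in unique:
            label_counts[a] += 1
            for b in unique:
                if a != b:
                    pair_counts[(a, b)] += 1
    return pair_counts, label_counts

def nb_rerank(candidates, context_labels, pair_counts, label_counts, alpha=1.0):
    """Re-rank the top-k candidate labels of one region.

    candidates     -- list of (label, base_score) from the base annotator
    context_labels -- candidate labels of the surrounding regions
    Returns candidates sorted by a naive-Bayes-style score combining the
    base score with smoothed co-occurrence probabilities.
    """
    vocab = len(label_counts)
    reranked = []
    for label, base_score in candidates:
        log_score = math.log(base_score + 1e-12)
        for ctx in context_labels:
            # P(ctx | label) with Laplace smoothing; treating context labels
            # as conditionally independent given the candidate is the
            # naive Bayes assumption.
            p = (pair_counts[(label, ctx)] + alpha) / (label_counts[label] + alpha * vocab)
            log_score += math.log(p)
        reranked.append((label, log_score))
    return sorted(reranked, key=lambda t: t[1], reverse=True)

# Toy usage: pick the best label for one region given its neighbors' candidates.
images = [["sky", "mountain", "tree"], ["sky", "cloud"], ["mountain", "rock", "sky"]]
pairs, counts = build_cooccurrence(images)
top_k = [("rock", 0.4), ("cloud", 0.35), ("sky", 0.25)]
neighbors = ["mountain", "tree"]
print(nb_rerank(top_k, neighbors, pairs, counts))
```

Because the co-occurrence counts can be computed once, off-line, the per-region re-ranking at annotation time is just a table lookup and a few multiplications, which matches the efficiency claim in the abstract.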