Approximate nearest neighbor search to support manual image annotation of large domain-specific datasets

  • Authors:
  • Bastiaan J. Boom; Phoenix X. Huang; Robert B. Fisher

  • Affiliations:
  • University of Edinburgh (all authors)

  • Venue:
  • Proceedings of the International Workshop on Video and Image Ground Truth in Computer Vision Applications
  • Year:
  • 2013

Abstract

The annotation of large datasets containing domain-specific images is both time-consuming and difficult. However, computer vision and machine learning methods have to deal with ever-increasing amounts of data, for which annotation is essential. Annotated images allow such methods to learn the variation present in large datasets and enable evaluation on those datasets. This paper presents a method for annotating domain-specific (fish species) images using approximate nearest neighbor search to retrieve images of similar fish species from a large set (216,501 images). The approximate nearest neighbor search allows us to find a ranked set of similar images within large datasets. Presenting similar images to users allows them to annotate images much more efficiently. Our user interface presents these images in such a way that users do not need knowledge of the specific domain to contribute to the annotation of images.
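The retrieval step described in the abstract (ranking visually similar images with approximate nearest neighbor search so an annotator can label them in bulk) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensionality, the Euclidean metric, the Annoy library, and the random vectors standing in for extracted image descriptors are all assumptions made for the example.

    # Hypothetical sketch: index precomputed image feature vectors with Annoy
    # (an approximate nearest neighbour library) and retrieve a ranked list of
    # visually similar images to present to an annotator.
    import numpy as np
    from annoy import AnnoyIndex

    DIM = 128       # assumed dimensionality of the image feature vectors
    N_TREES = 50    # more trees -> better recall, larger index
    TOP_K = 20      # number of candidate images shown to the annotator

    # Placeholder features; in practice these would be descriptors extracted
    # from the full set of 216,501 fish images (the descriptor used is not
    # specified here, so random vectors are used purely for illustration).
    features = np.random.rand(10_000, DIM).astype(np.float32)

    index = AnnoyIndex(DIM, "euclidean")
    for i, vec in enumerate(features):
        index.add_item(i, vec)
    index.build(N_TREES)

    # Query with the feature vector of an unlabelled image: the returned ids
    # form the ranked set of similar images shown in the annotation interface.
    query = features[0]
    ids, dists = index.get_nns_by_vector(query, TOP_K, include_distances=True)
    print(list(zip(ids, dists)))

The approximate (rather than exact) search is what keeps each query fast enough for interactive use on a dataset of this size; the annotator then accepts or corrects the label suggested by the retrieved neighbors instead of inspecting images one by one.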