A Multi-Directional Search technique for image annotation propagation

  • Authors:
  • Ning Yu;Kien A. Hua;Hao Cheng

  • Affiliations:
  • Department of Electrical Engineering & Computer Science, University of Central Florida, 4000 Central Florida Blvd., Orlando, FL 32816, USA (all authors)

  • Venue:
  • Journal of Visual Communication and Image Representation
  • Year:
  • 2012

Abstract

Image annotation has attracted considerable attention due to its importance in image understanding and search. In this paper, we propose a novel Multi-Directional Search framework for semi-automatic annotation propagation. The user interacts with the system by providing example images and their corresponding annotations during the propagation process. In each iteration, the example images are clustered, and the corresponding annotations are propagated separately within each cluster: images in the local neighborhood of each cluster are annotated, and some of these images are returned to the user for further annotation. As the user marks more images, the annotation process expands in multiple directions in the feature space. The query movements can thus be treated as navigation along multiple paths, each of which may split further based on the user's input. In this manner, the system provides accurate annotation assistance: images with the same semantic meaning but different visual characteristics are handled effectively. Comprehensive experiments on the Corel and University of Washington image databases show that the proposed technique annotates image databases accurately and efficiently.
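The cluster-and-propagate step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the feature representation, the clustering method (plain k-means with a deterministic initialization here), the single-neighbor propagation rule, and all function names (`kmeans`, `propagate`) are simplifying assumptions for one iteration of the loop.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Naive k-means with deterministic initialization (illustrative only)."""
    idx = np.linspace(0, len(points) - 1, k).round().astype(int)
    centers = points[idx].copy()
    for _ in range(iters):
        # Assign each point to its nearest center.
        assign = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = points[assign == j].mean(axis=0)
    return centers, assign

def propagate(examples, labels, unlabeled, k=2, n_neighbors=1):
    """One iteration of multi-directional propagation (sketch).

    Clusters the user-provided example images, then propagates each
    cluster's annotations to the unlabeled images nearest that cluster,
    so the search proceeds along one direction per cluster.
    """
    centers, assign = kmeans(examples, k)
    result = {}
    for j in range(k):
        members = np.where(assign == j)[0]
        if len(members) == 0:
            continue
        cluster_labels = {labels[i] for i in members}
        # Propagate this cluster's annotations to its nearest unlabeled images.
        d = np.linalg.norm(unlabeled - centers[j], axis=1)
        for idx in np.argsort(d)[:n_neighbors]:
            result.setdefault(int(idx), set()).update(cluster_labels)
    return result

# Hypothetical feature vectors: two visually distinct groups of examples.
examples = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
labels = ["tiger", "tiger", "beach", "beach"]
unlabeled = np.array([[0.2, 0.1], [5.2, 5.0], [4.9, 5.3]])
print(propagate(examples, labels, unlabeled, k=2))
```

In the full framework, the newly annotated images near each cluster would be shown to the user, whose corrections spawn or split search paths in the next iteration; this sketch covers only the propagation half of that loop.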