Hypergraph-based multi-example ranking with sparse representation for transductive learning image retrieval

  • Authors:
  • Chaoqun Hong; Jianke Zhu

  • Affiliations:
  • Faculty of Computer Science, Xiamen University of Technology, Ligong Road #600, Jimei, Xiamen, Fujian 361024, China; College of Computer Science, Zhejiang University, Hangzhou, Zhejiang 310027, China

  • Venue:
  • Neurocomputing
  • Year:
  • 2013

Abstract

Content-based image retrieval (CBIR) suffers from the well-known semantic gap. Query-By-Multiple-Examples (QBME) has been introduced to bridge this gap and is used in many CBIR systems. However, existing QBME methods usually query with each example separately and then merge the results, so the computational cost grows linearly with the number of query examples. In this paper, we propose a novel QBME method for fast image retrieval based on a transductive learning framework. To speed up QBME, we introduce two improvements. First, we learn the semantic correlations of the image data during training using sparse representation. With these correlations, a semantic correlation hypergraph (SCHG) is constructed to model the images and their relationships; the construction is parameter-free. Once the SCHG is built, the ranking values of images are predicted from the pre-learned semantic correlations. Second, we propose a multiple probing strategy for ranking with multiple query examples: unlike traditional QBME methods, which handle one input example at a time, all input examples are processed simultaneously. Experimental results demonstrate the effectiveness of the proposed method in both retrieval performance and speed.
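
To make the pipeline concrete, the following is a minimal sketch of the two ideas summarized in the abstract, under stated assumptions: the sparse representation step is approximated with scikit-learn's Lasso solver, and the ranking step uses the standard hypergraph regularization of Zhou et al. The matrices, weights, and parameter names are illustrative stand-ins, not the authors' SCHG implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_correlations(X, lam=0.01):
    """Reconstruct each image feature from the others with L1-regularized
    regression; the nonzero coefficients act as semantic correlations."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=lam, max_iter=5000)
        lasso.fit(X[others].T, X[i])        # dictionary columns = other images
        C[i, others] = np.abs(lasso.coef_)
    return C

def hypergraph_rank(C, query_idx, alpha=0.9):
    """Transductive ranking on a hypergraph with one hyperedge per image,
    linking it to its sparse-correlated neighbours; all query examples are
    injected into the label vector at once (multiple probing)."""
    n = C.shape[0]
    H = (C > 0).astype(float) + np.eye(n)   # vertex-hyperedge incidence matrix
    w = C.sum(axis=1) + 1e-12               # hyperedge weights from correlation strength
    dv = H @ w                               # vertex degrees
    de = H.sum(axis=0)                       # hyperedge degrees
    Dv_is = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv_is @ H @ np.diag(w / de) @ H.T @ Dv_is
    y = np.zeros(n)
    y[list(query_idx)] = 1.0                 # all query examples at the same time
    f = np.linalg.solve(np.eye(n) - alpha * Theta, (1 - alpha) * y)
    return np.argsort(-f)                    # image indices sorted by relevance
```

Passing every query index in `query_idx` at once reflects the multiple probing idea: a single linear system is solved regardless of how many examples the user supplies, rather than one retrieval pass per example.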