Parametric local multimodal hashing for cross-view similarity search

  • Authors:
  • Deming Zhai, Hong Chang, Yi Zhen, Xianming Liu, Xilin Chen, Wen Gao

  • Affiliations:
  • School of Computer Science and Technology, Harbin Institute of Technology, Harbin, China; Key Lab. of Intelligent Information Processing, Institute of Computing Technology, CAS, China; Hong Kong University of Science and Technology, Hong Kong, China

  • Venue:
  • IJCAI'13: Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence
  • Year:
  • 2013

Abstract

Recent years have witnessed the growing popularity of hashing for efficient large-scale similarity search. It has been shown that hashing quality can be boosted by hash function learning (HFL). In this paper, we study HFL in the context of multimodal data for cross-view similarity search. We present a novel multimodal HFL method, called Parametric Local Multimodal Hashing (PLMH), which learns a set of hash functions that locally adapt to the data structure of each modality. To balance locality and computational efficiency, the hashing projection matrix of each instance is parameterized, with a guaranteed approximation error bound, as a linear combination of basis hashing projections associated with a small set of anchor points. A locally optimal conjugate gradient algorithm is designed to learn the hash functions for each bit, and the overall hash codes are learned sequentially to progressively minimize the bias. Experimental evaluations on cross-media retrieval tasks demonstrate that PLMH performs competitively against state-of-the-art methods.
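
The anchor-based parameterization described above can be illustrated concretely. The Python/NumPy sketch below computes an instance-specific hashing projection as a weighted combination of basis projections attached to a small set of anchor points. The Gaussian anchor weighting, the random placeholder bases, and all names here are illustrative assumptions, not the quantities PLMH actually learns.

    # Minimal sketch of anchor-based local hashing (illustrative only).
    # In PLMH the basis projections are learned per bit; here they are
    # random placeholders so the parameterization itself is visible.
    import numpy as np

    rng = np.random.default_rng(0)

    d = 64         # feature dimension of one modality (assumed)
    n_bits = 32    # hash code length (assumed)
    n_anchors = 8  # small set of anchor points (assumed)

    # Anchor points (in practice chosen from the data, e.g. by k-means)
    # and one basis projection matrix per anchor.
    anchors = rng.standard_normal((n_anchors, d))
    bases = rng.standard_normal((n_anchors, n_bits, d))

    def anchor_weights(x, sigma=1.0):
        """Gaussian affinities of x to the anchors, normalized to sum to 1."""
        dists = np.linalg.norm(anchors - x, axis=1)
        w = np.exp(-dists**2 / (2 * sigma**2))
        return w / w.sum()

    def local_projection(x):
        """Instance-specific projection: a weighted combination of the bases."""
        w = anchor_weights(x)
        return np.tensordot(w, bases, axes=1)  # shape (n_bits, d)

    def hash_code(x):
        """Binary code from the sign of the locally adapted projection."""
        return (local_projection(x) @ x > 0).astype(np.uint8)

    x = rng.standard_normal(d)
    print(hash_code(x))  # e.g. array([1, 0, 1, ...], dtype=uint8)

The appeal of this parameterization, as the abstract notes, is that only a few basis matrices need to be stored and learned while every instance still receives its own locally adapted projection, rather than one global projection shared by all data.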