Learning with non-metric proximity matrices

  • Authors:
  • Gang Wu; Edward Y. Chang; Zhihua Zhang

  • Affiliations:
  • University of California, Santa Barbara, CA; University of California, Santa Barbara, CA; University of California, Santa Barbara, CA

  • Venue:
  • Proceedings of the 13th annual ACM international conference on Multimedia
  • Year:
  • 2005

Abstract

Many emerging applications produce non-metric (non-positive-semidefinite) proximity matrices, which therefore do not fit the framework of kernel machines. A popular remedy is to transform the spectrum of the similarity matrix so as to obtain a positive semidefinite kernel matrix. In this paper, we explore four representative transformation methods: denoise, flip, diffusion, and shift. Theoretically, we address the generalization problem in which the test data are not available during transformation, and propose an efficient algorithm for updating the cross-similarity matrix between test and training data. Extensive experiments evaluate the performance of these methods on several real-world (dis)similarity matrices carrying semantic meaning.
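
As a rough illustration of the four spectrum-transformation families named in the abstract, the NumPy sketch below eigendecomposes a symmetric similarity matrix and modifies its eigenvalues: clipping negative eigenvalues (denoise), taking absolute values (flip), exponentiating the spectrum (diffusion), and raising the whole spectrum by |lambda_min| (shift). The function name make_psd, the diffusion parameter t, and the exact form of each transform are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def make_psd(S, method="denoise", t=1.0):
    """Turn a symmetric, possibly indefinite, similarity matrix S into a
    PSD kernel by modifying its eigenvalue spectrum (hypothetical sketch)."""
    S = (S + S.T) / 2.0                      # guard against numerical asymmetry
    vals, vecs = np.linalg.eigh(S)           # S = V diag(vals) V^T

    if method == "denoise":                  # clip: drop negative eigenvalues
        vals = np.maximum(vals, 0.0)
    elif method == "flip":                   # flip: take absolute eigenvalues
        vals = np.abs(vals)
    elif method == "shift":                  # shift: add |lambda_min| to all eigenvalues
        vals = vals - min(vals.min(), 0.0)
    elif method == "diffusion":              # diffusion: exponentiate the spectrum
        vals = np.exp(t * vals)
    else:
        raise ValueError(f"unknown method: {method}")

    return (vecs * vals) @ vecs.T            # reconstruct K = V diag(vals) V^T

# Usage example: an indefinite "similarity" built from negated pairwise distances.
X = np.random.default_rng(0).normal(size=(6, 3))
S = -np.linalg.norm(X[:, None] - X[None, :], axis=-1)
K = make_psd(S, method="denoise")
print(np.linalg.eigvalsh(K).min() >= -1e-10)  # True: the transformed matrix is PSD
```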