Training SVM with indefinite kernels

  • Authors:
  • Jianhui Chen, Jieping Ye

  • Affiliations:
  • Arizona State University, Tempe, AZ (both authors)

  • Venue:
  • Proceedings of the 25th International Conference on Machine Learning (ICML)
  • Year:
  • 2008


Abstract

Similarity matrices generated in many applications may not be positive semidefinite, and hence cannot be used directly within the kernel machine framework. In this paper, we study the problem of training support vector machines with an indefinite kernel. We consider a regularized SVM formulation in which the indefinite kernel matrix is treated as a noisy observation of an unknown positive semidefinite matrix (the proxy kernel), and the support vectors and the proxy kernel are computed simultaneously. We propose a semi-infinite quadratically constrained linear program formulation for the optimization, which can be solved iteratively to find a globally optimal solution. We further employ an additional pruning strategy that significantly improves the efficiency of the algorithm while retaining its convergence property. In addition, we show the close relationship between the proposed formulation and multiple kernel learning. Experiments on a collection of benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithm.
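
As a rough illustration of the proxy-kernel idea only, the sketch below uses a simple alternating scheme: solve a standard SVM dual on the current positive semidefinite proxy kernel, then update the proxy kernel in closed form by projecting a rank-one correction of the observed indefinite matrix onto the PSD cone. This is not the paper's semi-infinite QCLP solver or its pruning strategy. The labels are assumed to be in {-1, +1}, and the parameter names rho (weight tying the proxy kernel to the observed matrix via a Frobenius-norm penalty) and C (SVM regularization) are illustrative, not the paper's notation.

```python
import numpy as np
from sklearn.svm import SVC


def psd_projection(M):
    """Project a symmetric matrix onto the PSD cone by clipping negative eigenvalues."""
    w, V = np.linalg.eigh((M + M.T) / 2.0)
    return (V * np.clip(w, 0.0, None)) @ V.T


def train_svm_indefinite(K0, y, rho=1.0, C=1.0, n_iter=20):
    """Simplified alternating scheme (a stand-in for the paper's joint optimization).

    K0 : (n, n) observed, possibly indefinite similarity matrix
    y  : (n,) labels in {-1, +1}
    """
    n = len(y)
    Y = np.diag(y.astype(float))
    K = psd_projection(K0)            # initial PSD proxy kernel
    alpha = np.zeros(n)

    for _ in range(n_iter):
        # Step 1: standard SVM dual solve on the current proxy kernel.
        svc = SVC(C=C, kernel="precomputed")
        svc.fit(K, y)
        alpha = np.zeros(n)
        # dual_coef_ holds y_i * alpha_i for the support vectors; recover alpha_i >= 0.
        alpha[svc.support_] = np.abs(svc.dual_coef_.ravel())

        # Step 2: closed-form minimizer over PSD K of
        #   -0.5 * a^T Y K Y a + rho * ||K - K0||_F^2,
        # i.e. the PSD projection of K0 + Y a a^T Y / (4 rho).
        K = psd_projection(K0 + (Y @ np.outer(alpha, alpha) @ Y) / (4.0 * rho))

    # Final SVM trained on the last proxy kernel.
    svc = SVC(C=C, kernel="precomputed").fit(K, y)
    return svc, K
```

Prediction with a precomputed kernel additionally requires the test-versus-training similarity block; how to handle it consistently under a learned proxy kernel is a separate design choice not covered by this sketch.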