A Reformulation of Support Vector Machines for General Confidence Functions

  • Authors:
  • Yuhong Guo; Dale Schuurmans

  • Affiliations:
  • Department of Computer & Information Sciences, Temple University; Department of Computing Science, University of Alberta

  • Venue:
  • ACML '09: Proceedings of the 1st Asian Conference on Machine Learning (Advances in Machine Learning)
  • Year:
  • 2009

Abstract

We present a generalized view of support vector machines that relies neither on a Euclidean geometric interpretation nor on positive semidefinite kernels. We base our development instead on the confidence matrix: the matrix normally given by the direct (Hadamard) product of the kernel matrix with the label outer-product matrix. It turns out that alternative forms of confidence matrices are possible, and indeed useful. By focusing on the confidence matrix rather than the underlying kernel, we derive an intuitive principle for optimizing example weights to yield robust classifiers. This principle initially recovers the standard quadratic SVM training criterion, which is convex only for kernel-derived confidence measures. Given our generalized view, however, we can then derive a principled relaxation of the SVM criterion that yields a convex upper bound; the relaxed problem is always convex and can be solved with a linear program. Our new training procedure obtains generalization performance similar to standard SVMs on kernel-derived confidence functions, and achieves even better results with indefinite confidence functions.
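To make the central construction concrete, here is a minimal NumPy sketch of the kernel-derived confidence matrix the abstract describes, C = K ∘ (yyᵀ), i.e. the Hadamard product of the kernel matrix with the label outer-product matrix. The function name, interface, and toy data are illustrative assumptions, not code from the paper; the paper's broader point is that C need not come from a kernel at all and may be indefinite.

```python
import numpy as np

def confidence_matrix(K, y):
    """Kernel-derived confidence matrix C[i, j] = y[i] * y[j] * K[i, j].

    K : (n, n) kernel (or, more generally, similarity) matrix
    y : (n,) vector of +/-1 labels

    Hypothetical helper for illustration; when K is positive
    semidefinite, so is C, but the paper's generalized view also
    admits indefinite confidence matrices.
    """
    y = np.asarray(y, dtype=float)
    return K * np.outer(y, y)  # elementwise (Hadamard) product with y y^T

# Toy usage: an RBF kernel on random 2-D points.
X = np.random.randn(5, 2)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-0.5 * sq_dists)
y = np.array([1, -1, 1, 1, -1])
C = confidence_matrix(K, y)
```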