DirectSVM: A Simple Support Vector Machine Perceptron

  • Authors:
  • Danny Roobaert

  • Affiliations:
  • Artificial Intelligence Lab, Department of Computer Science, University of Toronto, Toronto ON M5S 3H5, Canada

  • Venue:
  • Journal of VLSI Signal Processing Systems
  • Year:
  • 2002


Abstract

We propose a very simple learning algorithm, DirectSVM, for constructing support vector machine classifiers. This new algorithm is based on the proposition that the two closest training points of opposite class in a training set are support vectors, on condition that the training points in the set are linearly independent. The latter condition is always satisfied for soft-margin support vector machines with quadratic penalties. Other support vectors are found using the following conjecture: the training point that maximally violates the current hyperplane is also a support vector. We show that DirectSVM converges to a maximal margin hyperplane in M − 2 iterations, if the number of support vectors is M. DirectSVM is evaluated empirically on a number of standard databases. Performance-wise, the algorithm generalizes similarly to other implementations. Speed-wise, the proposed method is faster than a standard quadratic programming approach, and has the potential to be competitive with current state-of-the-art SVM implementations.
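The two geometric ideas stated in the abstract, initializing with the closest opposite-class pair and then locating the point that maximally violates the current hyperplane, can be illustrated in a short sketch. This is not the paper's full DirectSVM procedure (which also describes how the hyperplane is updated in each iteration); it is a minimal illustration of the initialization and violator-selection steps, with all function names being my own:

```python
import numpy as np

def closest_opposite_pair(X, y):
    """Indices (i, j) of the two closest training points of opposite
    class -- DirectSVM's proposed initial support vectors."""
    best, best_d = None, np.inf
    for i in range(len(X)):
        for j in range(len(X)):
            if y[i] != y[j]:
                d = np.linalg.norm(X[i] - X[j])
                if d < best_d:
                    best_d, best = d, (i, j)
    return best

def initial_hyperplane(X, y):
    """Maximal-margin hyperplane for the initial pair alone: the
    perpendicular bisector of the closest opposite-class points."""
    i, j = closest_opposite_pair(X, y)
    pos, neg = (X[i], X[j]) if y[i] > y[j] else (X[j], X[i])
    w = pos - neg                 # normal vector points toward the +1 class
    b = -w @ (pos + neg) / 2.0    # bisector passes through the midpoint
    return w, b

def worst_violator(X, y, w, b):
    """Index of the training point with the smallest (signed) margin
    y * (w . x + b) -- conjectured in the paper to be the next
    support vector."""
    margins = y * (X @ w + b)
    return int(np.argmin(margins))
```

In each DirectSVM iteration the worst violator is added to the candidate support vector set and the hyperplane is adjusted accordingly, which is why the algorithm needs only M − 2 iterations to collect all M support vectors beyond the initial pair.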