We propose a very simple learning algorithm, DirectSVM, for constructing support vector machine classifiers. The algorithm is based on the proposition that the two closest training points of opposite class in a training set are support vectors, provided that the training points in the set are linearly independent. The latter condition is always satisfied for soft-margin support vector machines with quadratic penalties. Further support vectors are found using the following conjecture: the training point that maximally violates the current hyperplane is also a support vector. We show that DirectSVM converges to a maximal-margin hyperplane in M − 2 iterations if the number of support vectors is M. DirectSVM is evaluated empirically on a number of standard databases. In terms of generalization performance, the algorithm is comparable to other implementations. In terms of speed, it is faster than a standard quadratic programming approach and has the potential to be competitive with current state-of-the-art SVM implementations.
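The two building blocks of the abstract — the closest opposite-class pair as the initial support vectors, and the maximally violating point as the next candidate — can be sketched in a few lines of NumPy. This is a hedged illustration of the initialization and the violator-selection step only, not the paper's full update rule: for exactly two points the maximal-margin hyperplane is their perpendicular bisector, which gives the starting hypothesis. All function names (`closest_opposite_pair`, `initial_hyperplane`, `worst_violator`) are my own illustrative choices.

```python
import numpy as np

def closest_opposite_pair(X, y):
    """Return indices of the two closest training points of opposite
    class -- DirectSVM's premise is that these are support vectors."""
    best, best_d = None, np.inf
    for i in range(len(X)):
        for j in range(len(X)):
            if y[i] != y[j]:
                d = np.linalg.norm(X[i] - X[j])
                if d < best_d:
                    best_d, best = d, (i, j)
    return best

def initial_hyperplane(X, y):
    """For two points, the maximal-margin hyperplane is their
    perpendicular bisector; use it as the starting hypothesis."""
    i, j = closest_opposite_pair(X, y)
    pos, neg = (X[i], X[j]) if y[i] == 1 else (X[j], X[i])
    w = pos - neg                     # normal points toward the + class
    b = -w @ (pos + neg) / 2.0        # plane passes through the midpoint
    return w, b

def worst_violator(X, y, w, b):
    """Conjecture used by DirectSVM: the point whose signed margin
    under the current hyperplane is smallest is also a support vector."""
    margins = y * (X @ w + b)
    return int(np.argmin(margins))
```

A toy run on four linearly separable 2-D points shows the mechanics: the closest opposite-class pair defines a vertical separating plane, and the remaining points are then ranked by how close they come to violating it.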