A training algorithm for optimal margin classifiers. COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory.
Machine Learning.
Making large-scale support vector machine learning practical. Advances in Kernel Methods.
Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods.
Support vector machines: hype or hallelujah? ACM SIGKDD Explorations Newsletter, special issue on "Scalable Data Mining Algorithms".
A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery.
Estimation of Dependences Based on Empirical Data. Springer Series in Statistics.
Successive overrelaxation for support vector machines. IEEE Transactions on Neural Networks.
A fast iterative nearest point algorithm for support vector machine classifier design. IEEE Transactions on Neural Networks.
Support Vector Machines are classifiers built around the computation of an optimal separating hyperplane. This hyperplane is typically obtained by solving a constrained quadratic programming problem, but it can also be found by solving an equivalent nearest point problem. Gilbert's Algorithm can solve this nearest point problem, but it converges too slowly to be practical for training. In this paper we present a modified version of Gilbert's Algorithm for fast computation of the Support Vector Machine hyperplane, and we compare our algorithm with the Nearest Point Algorithm and with Sequential Minimal Optimization.
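To illustrate the nearest point formulation the abstract refers to, the following is a minimal sketch of the classical (unmodified) Gilbert's Algorithm for finding the point of a convex hull nearest the origin; for separable data, the maximum-margin hyperplane can be recovered from such a nearest point computation. The function name and parameters are illustrative, and this is not the paper's modified algorithm.

```python
import numpy as np

def gilbert_nearest_point(points, max_iters=1000, tol=1e-8):
    """Approximate the point of conv(points) nearest the origin
    using Gilbert's iterative algorithm (classical version, a sketch)."""
    P = np.asarray(points, dtype=float)
    w = P[0].copy()  # start at an arbitrary vertex of the hull
    for _ in range(max_iters):
        # Support vertex: the extreme point of conv(P) in direction -w,
        # i.e. the vertex minimizing the inner product with w.
        p = P[np.argmin(P @ w)]
        # Optimality test: w is the nearest point when <w, w - p> ~ 0.
        if w @ (w - p) < tol:
            break
        # Exact line search: nearest point to the origin on segment [w, p].
        d = w - p
        t = np.clip((w @ d) / (d @ d), 0.0, 1.0)
        w = w - t * d
    return w
```

Each iteration costs one pass over the points plus a closed-form line search; the slow convergence of exactly this scheme is what motivates the modification described in the paper.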