In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one consisting of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: the calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs in an SVM classifier. We show that our method has key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that the simplified SVMs can run up to 100 times faster than the original SVMs with almost no loss in predictive accuracy.
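To make the construction concrete, below is a minimal sketch of the pairwise merging step, assuming a Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2) and the common reduced-set construction z = b*x_i + (1 - b)*x_j. The function name, the gamma parameter, the squared-sum objective, and the least-squares weight update are illustrative assumptions, not the paper's published formulas; the sketch only shows how merging two multi-weighted SVs can reduce to maximizing a one-variable function, as the abstract states.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def combine_two_svs(x_i, x_j, a_i, a_j, gamma):
    """Merge two SVs carrying per-class weight vectors a_i, a_j.

    Hypothetical sketch: assumes k(x, y) = exp(-gamma * ||x - y||^2) and
    restricts the merged vector to the segment z = b*x_i + (1 - b)*x_j,
    so the search is over a single variable b in [0, 1].
    """
    d2 = np.sum((x_i - x_j) ** 2)  # squared distance between the pair

    def kernels(b):
        # For z on the segment: ||x_i - z|| = (1-b)*d and ||x_j - z|| = b*d.
        k_iz = np.exp(-gamma * (1.0 - b) ** 2 * d2)  # k(x_i, z)
        k_jz = np.exp(-gamma * b ** 2 * d2)          # k(x_j, z)
        return k_iz, k_jz

    def neg_objective(b):
        # One plausible multi-weighted objective: the squared projection of
        # a_i*Phi(x_i) + a_j*Phi(x_j) onto Phi(z), summed over the binary
        # classifiers (note ||Phi(z)|| = 1 for a Gaussian kernel).
        k_iz, k_jz = kernels(b)
        return -np.sum((a_i * k_iz + a_j * k_jz) ** 2)

    # The single maximum of a one-variable function, per the abstract.
    b = minimize_scalar(neg_objective, bounds=(0.0, 1.0), method="bounded").x
    z = b * x_i + (1.0 - b) * x_j

    # Least-squares weight vector for z: a_z*Phi(z) best approximates
    # a_i*Phi(x_i) + a_j*Phi(x_j) because ||Phi(z)||^2 = 1.
    k_iz, k_jz = kernels(b)
    a_z = a_i * k_iz + a_j * k_jz
    return z, a_z
```

Repeating such a merge on the pair picked by a selection heuristic, until the target number of reduced vectors is reached, would give the overall simplification loop the abstract describes.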