Choosing Multiple Parameters for Support Vector Machines
Machine Learning
Sparse Bayesian Learning and the Relevance Vector Machine
The Journal of Machine Learning Research
Exact simplification of support vector solutions
The Journal of Machine Learning Research
Learning the Kernel Matrix with Semidefinite Programming
The Journal of Machine Learning Research
An efficient method for simplifying support vector machines
ICML '05: Proceedings of the 22nd International Conference on Machine Learning
Building Sparse Large Margin Classifiers
ICML '05: Proceedings of the 22nd International Conference on Machine Learning
Bounds on Error Expectation for Support Vector Machines
Neural Computation
Building Support Vector Machines with Reduced Classifier Complexity
The Journal of Machine Learning Research
This paper presents an algorithm for reducing a classifier's complexity by pruning its support vectors. The proposed algorithm retains the 'best' support vectors such that the span of the support vectors, as defined by Vapnik and Chapelle, is as small as possible. Experiments on real-world data sets show that the number of support vectors can be reduced, in some cases by as much as 85%, with little degradation in generalization performance.
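The following is a minimal Python sketch (not the paper's own code) of what span-based pruning of this kind could look like. It assumes a binary RBF-kernel SVM trained with scikit-learn, uses the simplified unconstrained span identity S_p^2 = 1 / (K~^{-1})_{pp} from Vapnik and Chapelle (ignoring box constraints), and simply keeps the support vectors with the smallest estimated spans; the function names and the keep_fraction parameter are illustrative assumptions, not taken from the paper.

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

def span_estimates(sv, gamma):
    """Approximate span S_p of each support vector (unconstrained case, box constraints ignored)."""
    n = len(sv)
    K = rbf_kernel(sv, sv, gamma=gamma)
    # Augment the SV kernel matrix with a row/column of ones to account for the bias term.
    K_tilde = np.block([[K, np.ones((n, 1))],
                        [np.ones((1, n)), np.zeros((1, 1))]])
    K_inv = np.linalg.pinv(K_tilde)
    # Simplified identity: S_p^2 = 1 / (K~^{-1})_{pp}; guard against division by zero.
    return 1.0 / np.maximum(np.diag(K_inv)[:n], 1e-12)

def prune_support_vectors(X, y, keep_fraction=0.15, gamma=0.5, C=1.0):
    """Train a binary RBF SVM, then keep only the support vectors with the smallest spans."""
    svm = SVC(kernel="rbf", gamma=gamma, C=C).fit(X, y)
    sv = svm.support_vectors_
    spans = span_estimates(sv, gamma)
    n_keep = max(2, int(keep_fraction * len(sv)))
    keep = np.argsort(spans)[:n_keep]  # retain the smallest-span support vectors
    return sv[keep], svm.dual_coef_[0][keep]

A pruned decision function can then be assembled from the retained support vectors and their dual coefficients, with the bias re-estimated on held-out data; how the retained coefficients are re-weighted is a design choice the sketch leaves open.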