Support Vector Machines (SVMs) for classification produce sparse models by maximizing the margin. This work considers two limitations of the technique: first, the number of support vectors can be large, and second, the model requires (Mercer) kernel functions. Recent works have proposed maximizing the margin while controlling sparsity, but they also require kernels. We propose a search process that selects a subset of basis functions to maximize the margin, without requiring the basis functions to be kernels. The sparsity of the model can be controlled explicitly. Experimental results show that accuracy close to that of SVMs can be achieved with much sparser models. Furthermore, at the same level of sparsity, more powerful search strategies tend to achieve better generalization than simpler ones.
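As a rough illustration of the idea, the Python sketch below performs plain greedy forward selection of basis functions under an explicit sparsity budget, scoring each candidate with a margin-maximizing linear classifier (scikit-learn's LinearSVC) on a held-out split. The synthetic data, the candidate basis functions (negative distances to training points, which need not satisfy the Mercer condition), the budget parameter, and the validation-accuracy selection criterion are all illustrative assumptions rather than the authors' exact algorithm; stronger strategies such as floating search would additionally interleave backward elimination steps.

    # Hedged sketch: greedy forward selection of non-kernel basis functions
    # under a fixed sparsity budget, with a margin-maximizing linear model.
    # This is an illustration under the assumptions stated above, not the
    # authors' method.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=400, n_features=10, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.5, random_state=0)

    # Candidate basis functions: similarities (negative distances) to randomly
    # chosen training points. They need not form a Mercer kernel.
    centers = X_train[rng.choice(len(X_train), size=50, replace=False)]

    def basis_outputs(X, selected):
        """Evaluate the selected basis functions on the rows of X."""
        return -np.linalg.norm(
            X[:, None, :] - centers[selected][None, :, :], axis=2)

    budget = 10        # explicit sparsity level: number of basis functions kept
    selected = []      # indices of chosen basis functions
    remaining = list(range(len(centers)))

    while len(selected) < budget:
        best_idx, best_score = None, -np.inf
        for j in remaining:
            trial = selected + [j]
            clf = LinearSVC(C=1.0, max_iter=5000)
            clf.fit(basis_outputs(X_train, trial), y_train)
            score = clf.score(basis_outputs(X_val, trial), y_val)
            if score > best_score:
                best_idx, best_score = j, score
        selected.append(best_idx)
        remaining.remove(best_idx)
        print(f"{len(selected)} basis functions, "
              f"validation accuracy = {best_score:.3f}")

The budget variable makes the sparsity of the final model explicit: the classifier uses exactly that many basis-function outputs, regardless of how many training points would have become support vectors in a standard SVM.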