Support Vector Machines (SVMs) have received great attention in pattern classification due to their strong generalization ability. The Least Squares formulation of the SVM (LS-SVM) finds the solution by solving a set of linear equations instead of a quadratic programming problem. Both SVMs and LS-SVMs have free parameters that must be tuned to the requirements of the given task. Despite their high performance, many techniques have been developed to improve them further, mainly new classification methods and the use of ensembles. In this paper, we propose to combine ensemble theory with a genetic algorithm to enhance LS-SVM classification. First, we randomly divide the feature space into subspaces to generate diversity among the classifiers of the ensemble. Then, we apply a genetic algorithm to optimize the classification of this ensemble of LS-SVMs, evaluating the approach on several benchmark data sets.
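The pipeline described above can be sketched end to end: train one LS-SVM per random feature subspace (the LS-SVM dual solution comes from a single linear system), then let a genetic algorithm search for member weights that maximize the accuracy of the weighted vote. This is a minimal NumPy illustration, not the authors' implementation: the synthetic data, the RBF kernel, the elitist real-coded GA, and all parameter values (`gamma`, `sigma`, ensemble size, population size) are assumptions chosen for a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM training reduces to one linear (KKT) system:
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                       # bias b, dual weights alpha

def lssvm_decision(Xtr, b, alpha, Xte, sigma=1.0):
    return rbf(Xte, Xtr, sigma) @ alpha + b

# Toy binary problem (hypothetical data): 3 informative + 3 noise features.
n = 120
y = np.where(rng.random(n) < 0.5, 1.0, -1.0)
X = rng.normal(size=(n, 6))
X[:, :3] += 1.5 * y[:, None]                     # only first 3 features carry signal
tr, va = slice(0, 60), slice(60, 120)

# Random subspace ensemble of LS-SVMs.
m, k = 8, 3                                      # ensemble size, subspace dimension
subspaces = [rng.choice(6, size=k, replace=False) for _ in range(m)]
models = [lssvm_fit(X[tr][:, s], y[tr]) for s in subspaces]
# Decision values of every ensemble member on the validation split.
F = np.array([lssvm_decision(X[tr][:, s], b, a, X[va][:, s])
              for s, (b, a) in zip(subspaces, models)])

def fitness(w):
    # Accuracy of the weighted-vote ensemble on the validation split.
    return np.mean(np.sign(w @ F) == y[va])

# Simple elitist real-coded GA over the member weights.
pop = rng.random((20, m))
for gen in range(30):
    fit = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(fit)[-10:]]           # keep the better half
    parents = elite[rng.integers(0, 10, size=(10, 2))]
    cut = rng.integers(1, m, size=10)            # one-point crossover
    children = np.array([np.concatenate((p[0][:c], p[1][c:]))
                         for p, c in zip(parents, cut)])
    children += 0.1 * rng.normal(size=children.shape)   # Gaussian mutation
    pop = np.vstack((elite, np.clip(children, 0.0, None)))

best = max(pop, key=fitness)
acc = fitness(best)
print(f"weighted-ensemble validation accuracy: {acc:.2f}")
```

Note that this sketch evaluates GA fitness on the same validation split it reports; a faithful experiment would hold out a separate test set so the reported accuracy is not inflated by the weight search.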