Boosting of Support Vector Machines with Application to Editing
ICMLA '05 Proceedings of the Fourth International Conference on Machine Learning and Applications
Support vector machines are among the most widely used methods for pattern classification, and the AdaBoost algorithm is an effective way of improving the performance of the weak learners that compose an ensemble. In this article, we propose an AdaBoost-based ensemble of SVMs, built by varying the Gaussian width parameter of the RBF-SVM. We evaluated the algorithm on data sets from the UCI repository.