The performance of support vector machines (SVMs) depends heavily on the values chosen for their hyperparameters. Tuning these hyperparameters is time-consuming, especially when the amount of training data is large. To overcome this difficulty, this paper proposes ensemble learning methods based on bagging and boosting. The proposed bagging methods reduce computation time while maintaining reasonable accuracy, and the proposed boosting method improves the accuracy of a conventional SVM classifier. The effectiveness of the proposed methods is demonstrated by numerical simulations.