Boosting has been shown to improve the predictive performance of unstable learners such as decision trees, but not of stable learners such as support vector machines (SVM). Beyond this model-stability problem, the high computational cost of training an SVM makes it impractical to generate the multiple models needed for an ensemble on large data sets. This paper introduces a method that both enables boosting to improve the predictive performance of SVM and reduces the computational cost, making SVM ensembles feasible for large data sets. The key idea is to build local models instead of global models; to the best of our knowledge, this is the first method to address both problems in boosting SVM simultaneously. In our experiments, the proposed method for boosting SVM also achieves better predictive accuracy than boosting decision trees.
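To make the boosting mechanics concrete, here is a minimal sketch of an AdaBoost-style loop over weight-aware base learners. This is not the paper's method: the paper trains local SVMs rather than global models, whereas this illustration substitutes simple one-dimensional threshold stumps so the code stays self-contained; the function names and the restriction to feature 0 are our own simplifications.

```python
import numpy as np

def fit_stump(X, y, w):
    """Fit a weighted decision stump on feature 0: predict s * sign(x - t)."""
    best = None
    for t in np.unique(X[:, 0]):
        for s in (1, -1):
            pred = s * np.sign(X[:, 0] - t + 1e-12)
            err = np.sum(w * (pred != y))
            if best is None or err < best[0]:
                best = (err, t, s)
    return best[1], best[2]

def adaboost(X, y, rounds=10):
    """AdaBoost with stumps as stand-ins for the paper's (local) SVM learners."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform instance weights to start
    models = []
    for _ in range(rounds):
        t, s = fit_stump(X, y, w)
        pred = s * np.sign(X[:, 0] - t + 1e-12)
        err = max(np.sum(w * (pred != y)), 1e-12)
        if err >= 0.5:               # base learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified instances
        w /= w.sum()
        models.append((alpha, t, s))
    return models

def predict(models, X):
    """Weighted vote of all boosted base learners."""
    agg = sum(a * s * np.sign(X[:, 0] - t + 1e-12) for a, t, s in models)
    return np.sign(agg)
```

Swapping `fit_stump` for an SVM trained with instance weights (or, as in the paper, local SVMs built on subsets of the data) recovers the setting the abstract describes, where the cost of each base learner dominates the feasibility of the ensemble.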