Although the support vector machine (SVM) has been shown to provide good generalization performance, the classification results of practically implemented SVMs often fall short of the theoretically expected level because their implementations rely on approximate algorithms to cope with high time and space complexity. To improve the limited classification performance of real SVMs, we propose SVM ensembles with bagging (bootstrap aggregating). Each individual SVM is trained independently on training samples chosen randomly via a bootstrap technique. The trained SVMs are then aggregated to make a collective decision in several ways, such as majority voting, LSE (least squares estimation)-based weighting, and double-layer hierarchical combining. Simulation results for IRIS data classification and hand-written digit recognition show that the proposed SVM ensembles with bagging greatly outperform a single SVM in terms of classification accuracy.
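The bagging-with-majority-voting scheme described above can be sketched as follows. This is a minimal illustration using scikit-learn, not the paper's implementation; the kernel, ensemble size, and the use of the IRIS dataset here are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Train each SVM independently on a bootstrap replicate
# (sampling the training set with replacement).
n_estimators = 10  # illustrative choice, not from the paper
members = []
for _ in range(n_estimators):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    members.append(SVC(kernel="rbf").fit(X_tr[idx], y_tr[idx]))

# Aggregate by majority voting over the individual predictions.
votes = np.stack([m.predict(X_te) for m in members])  # (n_estimators, n_test)
ensemble_pred = np.apply_along_axis(
    lambda col: np.bincount(col).argmax(), axis=0, arr=votes)

accuracy = (ensemble_pred == y_te).mean()
```

The LSE-based weighting and double-layer hierarchical combining mentioned in the abstract would replace the majority-vote step with a learned combination of the members' outputs.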