Ensemble classification - combining the predictions of a set of base learners - has received much attention in the machine learning community and has demonstrated promising capabilities for improving classification accuracy. Compared with neural network and decision tree ensembles, however, support vector machine (SVM) ensembles lack comprehensive empirical study. To fill this void, this paper analyses and compares SVM ensembles built with four different ensemble construction techniques, namely bagging, AdaBoost, Arc-X4 and a modified AdaBoost. Twenty real-world data sets from the UCI repository are used as benchmarks to evaluate and compare the performance of these SVM ensemble classifiers by classification accuracy. Different kernel functions and different numbers of base SVM learners are tested in the ensembles. The experimental results show that although SVM ensembles are not always better than a single SVM, the bagged SVM ensemble performs as well as or better than the other methods, with relatively better generality, particularly for SVMs with a polynomial kernel function. Finally, an industrial case study of gear defect detection is conducted to validate the empirical analysis results.