Boosting is used to determine the order in which the classifiers of a bagging ensemble are aggregated. Stopping the aggregation early in this ordered sequence identifies subensembles that require less memory for storage, classify faster, and can improve on the generalization accuracy of the original bagging ensemble. In all the classification problems investigated, pruned ensembles that retain only 20% of the original classifiers achieve statistically significant improvements over complete bagging. In problems where boosting outperforms bagging, these improvements are not sufficient to reach the accuracy of the corresponding boosting ensembles. However, ensemble pruning preserves the robustness of bagging in noisy classification tasks, where boosting often has larger generalization errors. Pruned bagging should therefore generally be preferred to complete bagging and, when no information about the level of noise is available, it is a robust alternative to AdaBoost.
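The procedure lends itself to a short sketch. The Python snippet below is a minimal illustration under stated assumptions, not the authors' exact algorithm: it trains a bagging ensemble with scikit-learn, orders its classifiers with an assumed AdaBoost-style reweighting rule (at each step selecting the remaining classifier with the smallest weighted training error, then up-weighting the examples it misclassifies), and keeps only the first 20% for prediction. The helper `boosting_order` and its weight-reset rule are hypothetical choices made for illustration; the paper's precise ordering criterion may differ.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def boosting_order(estimators, X, y):
    """Illustrative (assumed) boosting-based ordering: greedily pick the
    unselected classifier with the lowest weighted training error, then
    reweight the training examples as AdaBoost would."""
    n = len(X)
    w = np.full(n, 1.0 / n)
    preds = [est.predict(X) for est in estimators]
    remaining = list(range(len(estimators)))
    order = []
    while remaining:
        errors = [np.dot(w, preds[i] != y) for i in remaining]
        k = remaining[int(np.argmin(errors))]
        order.append(k)
        remaining.remove(k)
        err = max(np.dot(w, preds[k] != y), 1e-10)
        if err >= 0.5:
            # Assumed reset rule: restart with uniform weights when no
            # remaining classifier beats random guessing.
            w = np.full(n, 1.0 / n)
            continue
        beta = err / (1.0 - err)
        w = np.where(preds[k] == y, w * beta, w)  # down-weight correct examples
        w /= w.sum()
    return order


X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                        random_state=0).fit(X_tr, y_tr)

order = boosting_order(bag.estimators_, X_tr, y_tr)

# Early stopping: keep only the first 20% of the ordered ensemble and
# aggregate by majority vote (binary 0/1 labels here).
kept = [bag.estimators_[i] for i in order[:20]]
votes = np.stack([est.predict(X_te) for est in kept])
pruned_pred = np.round(votes.mean(axis=0))

print("full bagging accuracy:", bag.score(X_te, y_te))
print("pruned 20% accuracy:  ", (pruned_pred == y_te).mean())
```

The intuition behind the early stopping is that the boosting-style ordering places complementary classifiers first, so the vote of the initial 20% can match or exceed the full ensemble while storing and evaluating far fewer models.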