In this paper, Boosting is used to determine the order in which base predictors are aggregated into a Double-Bagging ensemble, and a subensemble is constructed by stopping the aggregation early according to two heuristic stopping rules. Across the investigated classification and regression problems, the pruned ensembles perform better than, or as well as, Bagging, Boosting, and the full randomly ordered Double-Bagging ensembles in most cases. The proposed method may therefore be a good choice when prediction accuracy, prediction speed, and storage requirements must all be taken into account.
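To make the procedure concrete, here is a minimal sketch of the idea in Python with scikit-learn. Plain bagged decision trees stand in for Double-Bagging (which additionally augments each bootstrap replicate with discriminant features fitted on its out-of-bag samples), and the stopping rule shown here, keeping the ordered prefix with the lowest held-out error, is one plausible heuristic rather than either of the two rules used in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# 1. Train the full, randomly ordered ensemble (plain bagged trees here;
#    Double-Bagging would also add discriminant features fitted on the
#    out-of-bag samples of each bootstrap replicate).
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                        random_state=0).fit(X_tr, y_tr)
members = list(bag.estimators_)

# 2. Boosting-based ordering: greedily pick the unused member with the
#    smallest weighted training error, then reweight the examples as
#    AdaBoost would, so later picks focus on what the prefix gets wrong.
w = np.full(len(y_tr), 1.0 / len(y_tr))
order, remaining = [], list(range(len(members)))
while remaining:
    errs = [(w * (members[i].predict(X_tr) != y_tr)).sum() for i in remaining]
    best = remaining[int(np.argmin(errs))]
    eps = float(np.clip(min(errs), 1e-10, 1 - 1e-10))
    beta = eps / (1.0 - eps)
    miss = members[best].predict(X_tr) != y_tr
    w *= np.where(miss, 1.0, beta)   # down-weight correctly classified points
    w /= w.sum()
    order.append(best)
    remaining.remove(best)

# 3. Early stopping: keep the ordered prefix whose majority vote has the
#    smallest error on a held-out set (one plausible heuristic rule).
P = np.array([members[i].predict(X_val) for i in order])      # (n, |val|)
ks = np.arange(1, len(order) + 1)
votes = (np.cumsum(P, axis=0) >= ks[:, None] / 2.0).astype(int)
errors = (votes != y_val).mean(axis=1)
k_star = int(np.argmin(errors)) + 1
print(f"kept {k_star}/{len(members)} members; "
      f"pruned error {errors[k_star - 1]:.3f} vs full {errors[-1]:.3f}")
```

The design point this illustrates is that boosting-style reweighting tends to push mutually complementary members to the front of the ordering, so a short prefix can match or beat the full ensemble while reducing both prediction time and storage.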