Stacked generalization
Neural Networks
MultiBoosting: A Technique for Combining Boosting and Wagging
Machine Learning
Genetic Algorithms in Search, Optimization and Machine Learning
Ensembling neural networks: many could be better than all
Artificial Intelligence
Boosting a Strong Learner: Evidence Against the Minimum Margin
ICML '99 Proceedings of the Sixteenth International Conference on Machine Learning
Improving Performance in Neural Networks Using a Boosting Algorithm
Advances in Neural Information Processing Systems 5, [NIPS Conference]
A decision-theoretic generalization of on-line learning and an application to boosting
EuroCOLT '95 Proceedings of the Second European Conference on Computational Learning Theory
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
Pose Invariant Face Recognition
FG '00 Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition 2000
Ensembles as a sequence of classifiers
IJCAI '97 Proceedings of the Fifteenth International Joint Conference on Artificial Intelligence - Volume 2
Issues in stacked generalization
Journal of Artificial Intelligence Research
Lung cancer cell identification based on artificial neural network ensembles
Artificial Intelligence in Medicine
Stability problems with artificial neural networks and the ensemble solution
Artificial Intelligence in Medicine
SOM Ensemble-Based Image Segmentation
Neural Processing Letters
Pruning in ordered bagging ensembles
ICML '06 Proceedings of the 23rd international conference on Machine learning
Using boosting to prune bagging ensembles
Pattern Recognition Letters
Collective-agreement-based pruning of ensembles
Computational Statistics & Data Analysis
Using Boosting to prune Double-Bagging ensembles
Computational Statistics & Data Analysis
Focused Ensemble Selection: A Diversity-Based Method for Greedy Ensemble Selection
ECAI '08 Proceedings of the 18th European Conference on Artificial Intelligence
Decision Templates Based RBF Network for Tree-Structured Multiple Classifier Fusion
MCS '09 Proceedings of the 8th International Workshop on Multiple Classifier Systems
Selection of decision stumps in bagging ensembles
ICANN'07 Proceedings of the 17th international conference on Artificial neural networks
Local decision bagging of binary neural classifiers
Canadian AI '08 Proceedings of the Canadian Society for Computational Studies of Intelligence, 21st Conference on Advances in Artificial Intelligence
Ensemble pruning via individual contribution ordering
Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining
Combining active learning and semi-supervised learning for improving learning performance
Proceedings of the 4th International Symposium on Applied Sciences in Biomedical and Communication Technologies
On the use of selective ensembles for relevance classification in case-based web search
ECCBR'06 Proceedings of the 8th European conference on Advances in Case-Based Reasoning
Ensemble pruning using harmony search
HAIS'12 Proceedings of the 7th international conference on Hybrid Artificial Intelligent Systems - Volume Part II
Ensemble approaches for regression: A survey
ACM Computing Surveys (CSUR)
A competitive ensemble pruning approach based on cross-validation technique
Knowledge-Based Systems
Pruning GP-based classifier ensembles by Bayesian networks
PPSN'12 Proceedings of the 12th international conference on Parallel Problem Solving from Nature - Volume Part I
Margin-based ordered aggregation for ensemble pruning
Pattern Recognition Letters
Decision trees: a recent overview
Artificial Intelligence Review
Malware detection by pruning of parallel ensembles using harmony search
Pattern Recognition Letters
Using Bayesian networks for selecting classifiers in GP ensembles
Information Sciences: an International Journal
An ensemble is generated by training multiple component learners for the same task and then combining their predictions. Most ensemble algorithms employ all of the trained component learners to constitute the ensemble. Recently, however, it has been shown that when the component learners are neural networks, it may be better to ensemble some instead of all of them. In this paper, this claim is generalized to situations where the component learners are decision trees. Experiments show that ensembles generated by a selective ensemble algorithm, which selects some of the trained C4.5 decision trees to make up an ensemble, can be not only smaller in size but also stronger in generalization than ensembles generated by non-selective algorithms.
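The idea can be sketched with a minimal selective-ensemble example. This is not the paper's exact algorithm (which works with C4.5 trees and its own selection criterion); it is an assumed greedy-selection variant using scikit-learn's `DecisionTreeClassifier` as the component learner: train a pool of trees on bootstrap resamples, then keep only those trees whose addition improves the majority vote's validation accuracy.

```python
# Hedged sketch of selective ensembling, NOT the paper's exact method:
# greedy forward selection of decision trees by validation accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Train a pool of trees, each on a bootstrap resample of the training set.
pool = []
for i in range(15):
    idx = rng.randint(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier(random_state=i).fit(X_tr[idx], y_tr[idx]))

def vote_accuracy(trees):
    """Validation accuracy of the majority vote over a list of trees."""
    votes = np.mean([t.predict(X_val) for t in trees], axis=0)
    return np.mean((votes > 0.5).astype(int) == y_val)

# Greedy forward selection: add a tree only if it raises the vote's accuracy.
selected, best = [], 0.0
for t in sorted(pool, key=lambda t: -t.score(X_val, y_val)):
    acc = vote_accuracy(selected + [t])
    if acc > best:
        selected, best = selected + [t], acc

print(f"kept {len(selected)} of {len(pool)} trees, accuracy {best:.3f}")
```

The selected subset is never larger than the pool, and because the single best tree is considered first, the selected ensemble's validation accuracy is at least that of the best individual tree, matching the abstract's "smaller in size, stronger in generalization" claim in spirit.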