MLDM '03: Proceedings of the 3rd International Conference on Machine Learning and Data Mining in Pattern Recognition
Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. However, they have an important shortcoming: storing a set of multiple hypotheses requires large amounts of memory. In this work, we devise an ensemble method that partially solves this problem by letting the components share their common parts. We employ a multi-tree, a structure that can simultaneously contain an ensemble of decision trees while allowing those trees to share some of their conditions. To construct the multi-tree, we define an algorithm based on beam search, with several extraction criteria and several forgetting policies for the suspended nodes. Finally, we compare the behaviour of this ensemble method with some well-known methods for generating hypothesis ensembles.
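The shared-components idea can be sketched as follows. This is a toy illustration, not the authors' implementation: the names `MultiNode` and `extract_trees` are hypothetical. Each node may keep suspended alternative splits, so every decision tree in the ensemble shares the path above the point where it diverges from the others:

```python
from dataclasses import dataclass, field

@dataclass
class MultiNode:
    test: str                 # attribute tested here, or the class label at a leaf
    children: list = field(default_factory=list)      # children of the active split
    alternatives: list = field(default_factory=list)  # suspended alternative splits

def extract_trees(node):
    """Enumerate the single decision trees embedded in a multi-tree."""
    if not node.children:                       # leaf: one tree, the class label
        own = [(node.test,)]
    else:                                       # combine one subtree per child
        combos = [[]]
        for child in node.children:
            combos = [c + [t] for c in combos for t in extract_trees(child)]
        own = [(node.test, tuple(c)) for c in combos]
    # each suspended alternative contributes extra trees that share
    # everything above this node
    alts = [t for alt in node.alternatives for t in extract_trees(alt)]
    return own + alts

leaf = lambda label: MultiNode(label)
inner = MultiNode("humidity", [leaf("yes"), leaf("no")],
                  alternatives=[MultiNode("wind", [leaf("yes"), leaf("no")])])
root = MultiNode("outlook", [leaf("yes"), inner])
print(len(extract_trees(root)))   # two trees sharing the "outlook" test
```

In a conventional ensemble the two trees would be stored separately, duplicating the root test; here the shared prefix is stored once and only the divergent subtrees are kept separately.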