Grafted trees are constructed in two stages: a first method builds an initial tree, and a second method completes it. In this work, the initial classifier is an unpruned tree trained on a 10% sample of the training data. Grafting can be used to construct ensembles of decision trees in which every member is a grafted tree, and on its own it outperforms Bagging. Grafted trees can also be combined with any other ensemble method: they are clearly beneficial for Bagging and Random Forests, while with Boosting the results depend on the variant considered. The best overall method is the Grafted Random Forest.
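The scheme above can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' implementation): each base learner starts as an unpruned tree fit on a 10% sample, and the grafting pass that completes the tree is stood in for by a simple refit on the full data, since no off-the-shelf grafting routine exists in scikit-learn. The ensemble wrapper is plain Bagging with majority voting, showing how grafted trees plug into any ensemble method.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def build_grafted_tree(X, y, rng):
    """Two-stage construction, sketched.

    Stage 1: an unpruned tree on a 10% sample of the training data.
    Stage 2: grafting would add tests to complete the tree; here we
    merely refit on the full data as a placeholder for that step.
    """
    idx = rng.choice(len(X), size=max(1, len(X) // 10), replace=False)
    tree = DecisionTreeClassifier(random_state=0)
    tree.fit(X[idx], y[idx])          # initial tree (stage 1)
    tree.fit(X, y)                    # placeholder for grafting (stage 2)
    return tree

def bagged_grafted_forest(X, y, n_trees=10, seed=0):
    """Bagging where every base learner is a (sketched) grafted tree."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        boot = rng.integers(0, len(X), size=len(X))   # bootstrap sample
        trees.append(build_grafted_tree(X[boot], y[boot], rng))
    return trees

def predict(trees, X):
    """Majority vote over the ensemble's per-tree predictions."""
    votes = np.stack([t.predict(X) for t in trees]).astype(int)
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```

Swapping the bootstrap loop for AdaBoost-style reweighting, or the base learner for one with random feature subsets, gives the Boosting and Random Forest variants the text compares.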