Bootstrap, boosting, and the random subspace method are popular techniques for inducing decision forests. In all of these techniques, each individual tree is induced in the same way as a single decision tree on the whole data, with all classes handled together. In trees induced this way, minority classes may be masked by others as branches grow or are pruned. For a multi-class problem, this paper proposes inducing a 1-vs-others rough decision tree for each class individually and combining the trees into a rough decision forest, with the aim of reducing the side effects of imbalanced class distributions. Since all training samples are reused to construct the rough decision tree for every class, the method also tends to retain the merits of bootstrap, boosting, and subspace methods. Experimental results and comparisons on several hard gene expression datasets show the attractiveness of the method.
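
The per-class construction lends itself to a short illustration. The following is a minimal Python sketch of the 1-vs-others forest idea only: scikit-learn's DecisionTreeClassifier stands in for the paper's rough-set-based tree induction (which the abstract does not specify), and the class name OneVsOthersForest is invented for illustration.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    class OneVsOthersForest:
        """One binary tree per class, each trained on the full data
        relabeled as 'class c' vs. 'all others' (assumes >= 2 classes)."""

        def fit(self, X, y):
            self.classes_ = np.unique(y)
            self.trees_ = {}
            for c in self.classes_:
                # Every training sample is reused for every per-class
                # tree, as in the paper's construction.
                tree = DecisionTreeClassifier()
                tree.fit(X, (y == c).astype(int))
                self.trees_[c] = tree
            return self

        def predict(self, X):
            # Column c holds each tree's estimated probability that a
            # sample belongs to class c; pick the most confident tree.
            scores = np.column_stack(
                [self.trees_[c].predict_proba(X)[:, 1] for c in self.classes_]
            )
            return self.classes_[np.argmax(scores, axis=1)]

Because each tree sees the full training set with only the labels changed, no class is diluted by subsampling; how confidence ties between per-class trees are resolved, and how rough-set tools refine each tree, are design choices the abstract leaves open.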