This paper studies the aggregation of predictions made by tree-based models over several perturbed versions of the attribute vector of a test case. A closed-form approximation of this scheme, combined with cross-validation to tune the level of perturbation, is proposed. This yields soft-tree models in a parameter-free way and preserves their interpretability. Empirical evaluations on classification and regression problems show that accuracy and the bias/variance tradeoff are improved significantly, at the price of an acceptable computational overhead. The method is further compared and combined with tree bagging.
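The core scheme described above (averaging a tree's predictions over noise-perturbed copies of the test attribute vector) can be sketched as follows. This is a minimal illustration, not the paper's method: the tree is a hypothetical one-attribute stump, the perturbation is Monte Carlo sampling of Gaussian noise rather than the closed-form approximation the paper proposes, and the names `tree_predict`, `perturb_and_aggregate`, and the noise level `sigma` are all invented for the sketch.

```python
import random

def tree_predict(x):
    """A hypothetical single-attribute regression stump.

    Its prediction jumps abruptly at the split point x = 0.5,
    which is the discontinuity that perturbation smooths out.
    """
    return 1.0 if x >= 0.5 else 0.0

def perturb_and_aggregate(x, sigma, n=1000, seed=0):
    """Average tree predictions over n Gaussian-perturbed copies of x.

    sigma plays the role of the perturbation level that the paper
    tunes by cross-validation; here it is a free parameter.
    """
    rng = random.Random(seed)
    preds = [tree_predict(x + rng.gauss(0.0, sigma)) for _ in range(n)]
    return sum(preds) / n

# Near the split the aggregated prediction interpolates smoothly between
# the two leaf values (a "soft tree"); far from the split it coincides
# with the crisp tree, so the tree structure stays interpretable.
```

Under this sketch, the aggregated prediction rises gradually from 0 to 1 as the test point crosses the split, instead of jumping, which is the mechanism behind the improved bias/variance tradeoff reported in the abstract.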