Bagging and random forests are widely used ensemble methods. Each forms an ensemble of models by randomly perturbing the fitting of a base learner. Estimation of the standard error of the resultant regression function is considered, and three estimators are discussed. One, based on the jackknife, is applicable to bagged estimators and can be computed directly from the bagged ensemble. The other two estimators target the bootstrap standard error estimator and require fitting multiple ensemble estimators, one for each bootstrap sample. It is shown that these bootstrap ensemble sizes can be small, which reduces the computation involved in forming the estimator. The estimators are studied using both simulated and real data.
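The jackknife idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a simple linear fit as a stand-in base learner and NumPy only, and all function names (`bagged_fit`, `jackknife_se`, etc.) are hypothetical. The key point is that the jackknife leave-one-out values can be recovered from a single bagged ensemble by averaging over the base learners whose bootstrap samples happened to omit each observation.

```python
import numpy as np

rng = np.random.default_rng(0)

def bagged_fit(x, y, B=200):
    """Fit B base learners (simple linear fits, a stand-in base learner)
    on bootstrap samples, keeping each bootstrap index set."""
    n = len(x)
    fits, index_sets = [], []
    for _ in range(B):
        idx = rng.integers(0, n, n)              # bootstrap sample
        fits.append(np.polyfit(x[idx], y[idx], deg=1))
        index_sets.append(idx)
    return fits, index_sets

def bagged_predict(fits, x0):
    """Bagged prediction at x0: the average over all base learners."""
    return np.mean([np.polyval(c, x0) for c in fits])

def jackknife_se(fits, index_sets, n, x0):
    """Jackknife-after-bagging standard error at x0. The leave-one-out
    value theta_(-i) is approximated by averaging the predictions of the
    base learners whose bootstrap sample omitted observation i."""
    theta = []
    for i in range(n):
        preds = [np.polyval(c, x0)
                 for c, idx in zip(fits, index_sets) if i not in idx]
        if preds:                                # ~36% of samples omit i
            theta.append(np.mean(preds))
    theta = np.asarray(theta)
    m = len(theta)
    return np.sqrt((m - 1) / m * np.sum((theta - theta.mean()) ** 2))

# Toy data: a noisy line y = 2x + eps.
n = 50
x = rng.uniform(0, 1, n)
y = 2 * x + rng.normal(0, 0.3, n)
fits, index_sets = bagged_fit(x, y)
print(bagged_predict(fits, 0.5), jackknife_se(fits, index_sets, n, 0.5))
```

Because each observation is left out of roughly 36% of the bootstrap samples, every leave-one-out average is typically available from the one fitted ensemble, which is why this estimator needs no refitting, unlike the two bootstrap-targeting estimators the abstract mentions.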