Random forests are a scheme proposed by Leo Breiman in the 2000s for building a predictor ensemble with a set of decision trees that grow in randomly selected subspaces of data. Despite growing interest and practical use, there has been little exploration of the statistical properties of random forests, and little is known about the mathematical forces driving the algorithm. In this paper, we offer an in-depth analysis of a random forests model suggested by Breiman (2004), which is very close to the original algorithm. We show in particular that the procedure is consistent and adapts to sparsity, in the sense that its rate of convergence depends only on the number of strong features and not on how many noise variables are present.
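To make the idea of trees grown in randomly selected subspaces concrete, the sketch below implements a deliberately simplified random forest regressor: each tree repeatedly picks a coordinate at random and splits the current cell at its midpoint, and the forest averages the trees' predictions. This is only an illustrative assumption-laden toy (names such as `n_trees` and `depth`, the uniform coordinate choice, and the midpoint split are all choices made here), not the exact model of Breiman (2004) analyzed in the paper; the toy data has two informative features among ten, echoing the sparse setting discussed in the abstract.

```python
# Minimal sketch of a "randomized subspace" forest regressor (illustrative only;
# not the paper's exact construction).  Each tree splits a randomly chosen
# coordinate at the midpoint of the current cell; the forest averages trees.
import numpy as np


def build_tree(X, y, cell_lo, cell_hi, depth, rng):
    """Recursively build one randomized tree over the cell [cell_lo, cell_hi]."""
    if depth == 0 or len(y) == 0:
        # Leaf: predict the mean response in this cell (0.0 if the cell is empty).
        return {"value": float(np.mean(y)) if len(y) else 0.0}
    j = rng.integers(cell_lo.shape[0])      # random split coordinate
    m = 0.5 * (cell_lo[j] + cell_hi[j])     # midpoint of the cell along coordinate j
    left = X[:, j] <= m
    left_hi, right_lo = cell_hi.copy(), cell_lo.copy()
    left_hi[j], right_lo[j] = m, m
    return {
        "feature": j,
        "threshold": m,
        "left": build_tree(X[left], y[left], cell_lo, left_hi, depth - 1, rng),
        "right": build_tree(X[~left], y[~left], right_lo, cell_hi, depth - 1, rng),
    }


def predict_tree(tree, x):
    while "value" not in tree:
        tree = tree["left"] if x[tree["feature"]] <= tree["threshold"] else tree["right"]
    return tree["value"]


def random_forest_predict(X_train, y_train, X_test, n_trees=50, depth=6, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    trees = [build_tree(X_train, y_train, lo, hi, depth, rng) for _ in range(n_trees)]
    # The forest prediction is the average of the individual tree predictions.
    return np.array([np.mean([predict_tree(t, x) for t in trees]) for x in X_test])


if __name__ == "__main__":
    # Toy sparse regression: only the first 2 of 10 features carry signal.
    rng = np.random.default_rng(1)
    X = rng.uniform(size=(500, 10))
    y = np.sin(4 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)
    X_test = rng.uniform(size=(100, 10))
    y_test = np.sin(4 * X_test[:, 0]) + X_test[:, 1] ** 2
    pred = random_forest_predict(X, y, X_test)
    print("test MSE:", float(np.mean((pred - y_test) ** 2)))
```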