In this paper we present our work on the parametrization of Random Forests (RF), and more particularly on the number K of features randomly selected at each node during the tree induction process. It has been shown that this hyperparameter can play a significant role in performance. However, the value of K is usually chosen either by a greedy search that tests every possible value to find the optimal one, or by picking a priori one of the three arbitrary values commonly used in the literature. In this work we show that none of those three values is always better than the others. We therefore propose an alternative to these arbitrary choices of K: a new "push-button" RF induction method, called Forest-RK, for which K is no longer a hyperparameter. Our experiments show that this new method is statistically at least as accurate as the original RF method with a default K setting.
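The core idea can be illustrated with a small sketch: instead of fixing K once for the whole forest, K is re-drawn uniformly at random at each node before the random feature subset is sampled. The code below is a minimal, simplified illustration of that principle, not the authors' implementation; the function names, the Gini-based split search, and the default tree depth are all assumptions made for the example.

```python
import random
from collections import Counter

def gini(labels):
    # Gini impurity of a list of class labels.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y, features):
    # Exhaustive search for the best (feature, threshold) pair,
    # restricted to the given candidate features.
    best, best_score = None, float("inf")
    for f in features:
        for t in sorted(set(row[f] for row in X)):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best_score:
                best_score, best = score, (f, t)
    return best

def build_tree(X, y, rng, max_depth=5):
    # Forest-RK principle: K is re-drawn at random at EACH node,
    # so it is no longer a hyperparameter of the forest.
    if max_depth == 0 or len(set(y)) == 1:
        return Counter(y).most_common(1)[0][0]      # leaf: majority label
    m = len(X[0])
    k = rng.randint(1, m)                           # random K for this node
    features = rng.sample(range(m), k)              # K features, no replacement
    split = best_split(X, y, features)
    if split is None:
        return Counter(y).most_common(1)[0][0]
    f, t = split
    li = [i for i, row in enumerate(X) if row[f] <= t]
    ri = [i for i, row in enumerate(X) if row[f] > t]
    return (f, t,
            build_tree([X[i] for i in li], [y[i] for i in li], rng, max_depth - 1),
            build_tree([X[i] for i in ri], [y[i] for i in ri], rng, max_depth - 1))

def predict(tree, row):
    while isinstance(tree, tuple):
        f, t, left, right = tree
        tree = left if row[f] <= t else right
    return tree

def forest_rk(X, y, n_trees=15, seed=0):
    # Standard RF bagging: each tree is grown on a bootstrap sample.
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(build_tree([X[i] for i in idx], [y[i] for i in idx], rng))
    return forest

def forest_predict(forest, row):
    # Majority vote over the trees, as in the original RF method.
    return Counter(predict(t, row) for t in forest).most_common(1)[0][0]
```

Note that the only change relative to a standard RF induction is the single line drawing `k` inside `build_tree`; bagging and majority voting are untouched, which is what makes the method "push-button".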