In his original paper on random forests, Breiman proposed two different decision tree ensembles: one generated from "orthogonal" trees with thresholds on individual features in every split, and one from "oblique" trees separating the feature space by randomly oriented hyperplanes. In spite of a rising interest in the random forest framework, however, ensembles built from orthogonal trees (RF) have gained most, if not all, attention so far. In the present work we propose to employ "oblique" random forests (oRF) built from multivariate trees which explicitly learn optimal split directions at internal nodes using linear discriminative models, rather than using random coefficients as in the original oRF. This oRF outperforms RF, as well as other classifiers, on nearly all data sets except those with discrete factorial features. Learned node models perform distinctly better than random splits. An oRF feature importance score proves preferable to standard RF feature importance scores such as Gini or permutation importance. The topology of the oRF decision space appears to be smoother and better adapted to the data, resulting in improved generalization performance. Overall, the oRF proposed here may be preferred over standard RF on most learning tasks involving numerical and spectral data.
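To illustrate the split mechanism described above, the following is a minimal sketch (not the authors' implementation) of a single oblique node: a regularized linear model is fit on a random feature subspace, and its learned coefficient vector defines the splitting hyperplane, in contrast to an axis-aligned threshold on a single feature. The class name ObliqueSplit, the use of ridge regression as the node model, and all parameter values are illustrative assumptions.

import numpy as np
from sklearn.linear_model import RidgeClassifier

class ObliqueSplit:
    """Sketch of one oblique split node: a learned hyperplane on a random feature subspace."""

    def __init__(self, n_subspace_features, random_state=None):
        self.n_subspace_features = n_subspace_features
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        # Sample a random feature subspace, as in standard random forests.
        self.features_ = self.rng.choice(
            X.shape[1], size=self.n_subspace_features, replace=False)
        # Learn the split direction with a regularized linear model,
        # instead of drawing random coefficients or thresholding one feature.
        self.model_ = RidgeClassifier(alpha=1.0).fit(X[:, self.features_], y)
        return self

    def split(self, X):
        # Project samples onto the learned hyperplane normal; the sign of the
        # decision function sends each sample to the left or right child.
        return self.model_.decision_function(X[:, self.features_]) > 0.0


if __name__ == "__main__":
    # Toy data whose class boundary is oblique (depends on two features jointly).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    node = ObliqueSplit(n_subspace_features=5, random_state=0).fit(X, y)
    print("fraction of samples sent to the left child:", node.split(X).mean())

A full oblique forest would grow many such nodes recursively on bootstrap samples; the sketch only shows how one learned multivariate split differs from an axis-aligned or randomly oriented one.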