We propose a new one-class classification method, called One Class Random Forest, that is able to learn from samples of a single class only. The method, based on a random forest algorithm and an original outlier generation procedure, exploits the ensemble learning mechanisms of random forests to reduce both the number of artificial outliers that must be generated and the size of the feature space in which they are generated. We show that One Class Random Forests perform well on several UCI public datasets in comparison to other state-of-the-art one-class classification methods (Gaussian density models, Parzen estimators, Gaussian mixture models, and one-class SVMs).
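The general idea of turning one-class learning into a supervised problem via artificial outliers can be sketched as follows. This is a minimal illustration of that family of approaches, not the authors' exact algorithm: it draws outliers uniformly over an enlarged bounding box of the target data and trains an off-the-shelf random forest, without the per-tree subspace outlier generation the paper proposes. The dataset, box-margin factor, and forest size are arbitrary choices for the example.

```python
# Sketch: one-class classification via artificial outlier generation.
# NOT the paper's One Class Random Forest algorithm -- a simplified
# baseline assuming uniform outliers in an enlarged bounding box.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Target class: toy points clustered around the origin.
X_target = rng.normal(loc=0.0, scale=1.0, size=(200, 4))

# Artificial outliers: uniform over a box 20% larger than the
# target data's bounding box (a common, simple generation scheme).
lo, hi = X_target.min(axis=0), X_target.max(axis=0)
margin = 0.2 * (hi - lo)
X_outlier = rng.uniform(lo - margin, hi + margin, size=(200, 4))

# Labels: 1 = target class, 0 = artificial outlier.
X = np.vstack([X_target, X_outlier])
y = np.r_[np.ones(len(X_target)), np.zeros(len(X_outlier))]

# A standard random forest now solves the induced binary problem.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# A point near the origin should be scored as the target class.
print(clf.predict([[0.0, 0.0, 0.0, 0.0]])[0])
```

The cost of this naive scheme is that the number of uniform outliers needed to cover the box grows exponentially with the dimension, which is precisely the problem the paper's random-subspace mechanism is designed to mitigate: each tree sees outliers generated only in a low-dimensional random subspace.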