In recent years, many approaches have been proposed that achieve high performance by combining multiple classifiers. Bagging exploits random replicates of the training samples, while the random subspace method draws randomly chosen feature subsets. In this paper, we introduce a method that selects both samples and features at the same time and demonstrate its effectiveness. The method includes a parametric bagging and a parametric random subspace method as special cases. In our experiments, the proposed method and the parametric random subspace method showed the best performance.
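As an illustration of the idea, the following is a minimal sketch of an ensemble that randomizes over samples and features simultaneously: each base tree is fit on a bootstrap sample of the rows and a random subset of the columns, and predictions are combined by majority vote. The class name, the `sample_frac`/`feature_frac` parameters, and the use of scikit-learn decision trees are assumptions for this sketch, not the authors' exact parametric formulation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class SampleFeatureEnsemble:
    """Illustrative ensemble (hypothetical API): each base tree is trained
    on a bootstrap sample of the rows AND a random subset of the columns."""

    def __init__(self, n_estimators=25, sample_frac=1.0, feature_frac=0.5, seed=0):
        self.n_estimators = n_estimators
        self.sample_frac = sample_frac    # fraction of rows drawn (with replacement)
        self.feature_frac = feature_frac  # fraction of columns drawn (without replacement)
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n, d = X.shape
        n_samp = max(1, int(round(self.sample_frac * n)))
        n_feat = max(1, int(round(self.feature_frac * d)))
        self.classes_ = np.unique(y)
        self.members_ = []
        for _ in range(self.n_estimators):
            rows = self.rng.integers(0, n, size=n_samp)            # bootstrap rows
            cols = self.rng.choice(d, size=n_feat, replace=False)  # feature subset
            tree = DecisionTreeClassifier(random_state=0)
            tree.fit(X[np.ix_(rows, cols)], y[rows])
            self.members_.append((tree, cols))
        return self

    def predict(self, X):
        # Majority vote over the ensemble members.
        votes = np.zeros((X.shape[0], len(self.classes_)))
        for tree, cols in self.members_:
            pred = tree.predict(X[:, cols])
            for i, c in enumerate(self.classes_):
                votes[:, i] += (pred == c)
        return self.classes_[np.argmax(votes, axis=1)]
```

Setting `feature_frac=1.0` reduces this sketch to plain bagging, while keeping all rows (no bootstrap) and subsampling only columns would recover the random subspace method, which mirrors how the paper's method contains both as special cases.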