Rotation Forest is a recently proposed method for building classifier ensembles from independently trained decision trees. It was found to be more accurate than bagging, AdaBoost, and Random Forest ensembles across a collection of benchmark data sets. This paper carries out a lesion study on Rotation Forest in order to find out which of its parameters and randomization heuristics are responsible for its good performance. Contrary to common intuition, the features extracted through PCA gave the best results, compared with those extracted through non-parametric discriminant analysis (NDA) or random projections. The only ensemble method whose accuracy was statistically indistinguishable from that of Rotation Forest was LogitBoost, although it gave slightly inferior results on 20 of the 32 benchmark data sets. The main factor behind the success of Rotation Forest appears to be that the transformation matrix used to compute the (linear) extracted features is sparse.
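To make the sparsity point concrete, the following is a minimal sketch (not the full algorithm, which also bootstraps training samples and removes random class subsets before each PCA) of how one Rotation Forest tree's rotation matrix can be built: the features are split into disjoint random subsets, PCA is run on each subset, and each subset's principal axes form one block of an otherwise-zero matrix. The function name `rotation_matrix` and its parameters are illustrative, not from the paper.

```python
import numpy as np

def rotation_matrix(X, n_subsets=4, rng=None):
    """Sketch of one Rotation Forest rotation matrix: disjoint random
    feature subsets, PCA per subset, blocks placed on disjoint index
    sets so the full matrix is sparse (block-structured)."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    # Randomly partition the feature indices into disjoint subsets.
    perm = rng.permutation(n_features)
    subsets = np.array_split(perm, n_subsets)
    R = np.zeros((n_features, n_features))
    for idx in subsets:
        Xs = X[:, idx]
        Xc = Xs - Xs.mean(axis=0)  # centre before PCA
        # Principal axes = right singular vectors of the centred data.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        # Fill only this subset's block; all other entries stay zero.
        R[np.ix_(idx, idx)] = Vt.T
    return R

# Each tree in the ensemble is then trained on the rotated data X @ R;
# a different random partition (and so a different sparse R) per tree
# provides the ensemble's diversity.
```

Because each PCA sees only its own feature subset, every row and column of `R` has non-zero entries only within one block, which is exactly the sparsity the study identifies as the key factor.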