Introduction to statistical pattern recognition (2nd ed.)
Machine Learning
The Random Subspace Method for Constructing Decision Forests
IEEE Transactions on Pattern Analysis and Machine Intelligence
The Role of Combining Rules in Bagging and Boosting
Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition
Boosting in Linear Discriminant Analysis
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Bagging and the Random Subspace Method for Redundant Feature Spaces
MCS '01 Proceedings of the Second International Workshop on Multiple Classifier Systems
Complexity of Classification Problems and Comparative Advantages of Combined Classifiers
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Ensembles of Learning Machines
WIRN VIETRI 2002 Proceedings of the 13th Italian Workshop on Neural Nets-Revised Papers
Feature Selection for Ensembles of Simple Bayesian Classifiers
ISMIS '02 Proceedings of the 13th International Symposium on Foundations of Intelligent Systems
Empirical characterization of random forest variable importance measures
Computational Statistics & Data Analysis
Learn++.MF: A random subspace approach for the missing feature problem
Pattern Recognition
Supervised subspace projections for constructing ensembles of classifiers
Information Sciences: an International Journal
The performance of a single weak classifier can be improved by combining techniques such as bagging, boosting and the random subspace method. When applied to linear discriminant analysis, these techniques turn out to be useful in different situations: their performance is strongly affected by the choice of the base classifier and by the training sample size, and their usefulness also depends on the data distribution. In this paper, using the pseudo Fisher linear classifier as an example, we study the effect of redundancy in the data feature set on the performance of the random subspace method and bagging.
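The two combining techniques studied in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is an assumed NumPy rendering of a pseudo Fisher linear classifier (Fisher discriminant computed with the Moore-Penrose pseudoinverse, so it stays defined when the sample size is smaller than the dimensionality), combined by (a) the random subspace method, where each base classifier sees a random subset of the features, and (b) bagging, where each base classifier sees a bootstrap sample of the training objects. All function names and parameter values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_pseudo_fisher(X, y):
    """Pseudo Fisher linear discriminant for two classes (0/1).

    Uses the pseudoinverse of the pooled within-class covariance,
    so the direction is defined even for singular covariance
    (e.g. fewer training objects than features)."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = len(X0), len(X1)
    S = (np.cov(X0.T) * (n0 - 1) + np.cov(X1.T) * (n1 - 1)) / (n0 + n1 - 2)
    w = np.linalg.pinv(np.atleast_2d(S)) @ (m1 - m0)
    b = -w @ (m0 + m1) / 2.0
    return w, b

def predict(wb, X):
    w, b = wb
    return (X @ w + b > 0).astype(int)

def random_subspace_ensemble(X, y, n_classifiers=25, subspace_dim=5):
    """Random subspace method: each member is trained on a random
    subset of the features; all training objects are used."""
    members = []
    for _ in range(n_classifiers):
        idx = rng.choice(X.shape[1], size=subspace_dim, replace=False)
        members.append((idx, fit_pseudo_fisher(X[:, idx], y)))
    return members

def bagging_ensemble(X, y, n_classifiers=25):
    """Bagging: each member is trained on a bootstrap sample of the
    training objects; all features are used."""
    all_features = np.arange(X.shape[1])
    members = []
    for _ in range(n_classifiers):
        idx = rng.choice(len(X), size=len(X), replace=True)
        members.append((all_features, fit_pseudo_fisher(X[idx], y[idx])))
    return members

def ensemble_predict(members, X):
    """Majority vote over the ensemble members."""
    votes = np.mean([predict(wb, X[:, f]) for f, wb in members], axis=0)
    return (votes > 0.5).astype(int)
```

On data with many redundant features (the setting the paper studies), the random subspace members each work in a low-dimensional, better-conditioned space, while bagging members vary only in which objects they see; this sketch is enough to reproduce that qualitative contrast on synthetic Gaussian data.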