Introduction to Statistical Pattern Recognition (2nd ed.)
Machine Learning
The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Boosted mixture of experts: an ensemble learning scheme. Neural Computation.
The Role of Combining Rules in Bagging and Boosting. Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition.
Bagging and the Random Subspace Method for Redundant Feature Spaces. MCS '01 Proceedings of the Second International Workshop on Multiple Classifier Systems.
Combining Fisher Linear Discriminants for Dissimilarity Representations. MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems.
In recent years, boosting [6] has, together with bagging [5] and the random subspace method [15], become one of the most popular combining techniques for improving a weak classifier. Usually, boosting is applied to decision trees (DTs). In this paper, we study boosting in Linear Discriminant Analysis (LDA). Simulation studies, carried out on one artificial data set and two real data sets, show that boosting can be useful in LDA for large training sample sizes, while bagging is useful for critical training sample sizes [11]. Contrary to a common opinion, we demonstrate that the usefulness of boosting does not depend on the instability of the classifier.
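The combination the abstract studies, boosting applied to LDA base classifiers, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a plain Fisher discriminant and AdaBoost.M1, and since this simple LDA fit ignores example weights, each round trains on a weighted bootstrap resample instead (a common workaround; the ridge term, round count, and stopping rule are all assumptions).

```python
import numpy as np

def fit_lda(X, y):
    """Fisher linear discriminant for two classes labelled -1/+1.
    Returns (w, b) so that sign(X @ w + b) is the prediction."""
    X0, X1 = X[y == -1], X[y == +1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter; a small ridge keeps it invertible.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1)
          + 1e-6 * np.eye(X.shape[1]))
    w = np.linalg.solve(Sw, m1 - m0)
    b = -w @ (m0 + m1) / 2.0  # threshold midway between projected class means
    return w, b

def boost_lda(X, y, n_rounds=20, rng=None):
    """AdaBoost.M1 with LDA base learners trained on weighted resamples
    (the simple LDA above cannot use sample weights directly)."""
    rng = np.random.default_rng(rng)
    n = len(X)
    D = np.full(n, 1.0 / n)                    # example weights
    ensemble = []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=D)       # weighted bootstrap sample
        w, b = fit_lda(X[idx], y[idx])
        pred = np.sign(X @ w + b)
        err = D[pred != y].sum()               # weighted training error
        if err == 0.0:                         # perfect learner: keep it, stop
            ensemble.append((1.0, w, b))
            break
        if err >= 0.5:                         # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, w, b))
        D *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
        D /= D.sum()
    return ensemble

def predict(ensemble, X):
    """Weighted vote of the boosted LDA classifiers."""
    score = sum(a * np.sign(X @ w + b) for a, w, b in ensemble)
    return np.sign(score)
```

On overlapping Gaussian classes each LDA round is only a weak learner, and the ensemble's weighted vote combines the rounds, which is the setting the paper's experiments probe at different training sample sizes.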