In this paper, we propose an approach to ensemble construction based on supervised projections, both linear and non-linear, that aims at both accuracy and diversity of the individual classifiers. The proposed approach follows the philosophy of boosting, focusing effort on difficult instances, but instead of training each classifier on a reweighted distribution of the training set, it uses the misclassified instances to find a supervised projection that favors their correct classification. We show that supervised projection algorithms can be used for this task, and we evaluate several known supervised projections, both linear and non-linear, to test their suitability within this framework. Additionally, the method is further improved by introducing concepts from oversampling for imbalanced datasets, which counteracts the negative effect of having too few instances for constructing the supervised projections. Compared with AdaBoost on a large set of 45 problems from the UCI Machine Learning Repository, the method shows improved performance, as well as better robustness in the presence of noise.
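The core loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a nearest-centroid base learner and a two-class Fisher discriminant as the supervised projection, and it omits the oversampling refinement. Each round fits a projection on the instances the current ensemble misclassifies (falling back to the full training set when the misclassified set is too small or single-class), then trains the next member in the projected space.

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_centroid_fit(X, y):
    # Base learner: one centroid per class.
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(model, X):
    classes, centroids = model
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(d, axis=1)]

def fisher_direction(X, y):
    # Two-class Fisher discriminant direction w = S_w^{-1} (mu_1 - mu_0),
    # regularized for numerical stability.
    c0, c1 = np.unique(y)[:2]
    X0, X1 = X[y == c0], X[y == c1]
    Sw = np.cov(X0.T) + np.cov(X1.T) + 1e-6 * np.eye(X.shape[1])
    w = np.linalg.solve(Sw, X1.mean(axis=0) - X0.mean(axis=0))
    return w / np.linalg.norm(w)

def predict_ensemble(members, X):
    # Majority vote over all members, each in its own projected space.
    votes = np.stack([nearest_centroid_predict(m, X @ w[:, None])
                      for w, m in members])
    out = []
    for col in votes.T:
        vals, cnt = np.unique(col, return_counts=True)
        out.append(vals[np.argmax(cnt)])
    return np.array(out)

def fit_ensemble(X, y, rounds=5):
    members = []
    miss = np.ones(len(y), dtype=bool)  # round 1: use the whole training set
    for _ in range(rounds):
        Xm, ym = X[miss], y[miss]
        cls = np.unique(ym)
        # Fit the supervised projection on the misclassified instances when
        # both classes are represented with at least 2 samples each.
        if len(cls) == 2 and all((ym == c).sum() >= 2 for c in cls):
            w = fisher_direction(Xm, ym)
        else:
            w = fisher_direction(X, y)  # fallback: project on the full set
        model = nearest_centroid_fit(X @ w[:, None], y)
        members.append((w, model))
        miss = predict_ensemble(members, X) != y
        if not miss.any():
            break
    return members

# Usage on a synthetic two-class problem (two Gaussian blobs).
X = np.vstack([rng.normal(0, 1, (40, 3)) + [3.0, 0.0, 0.0],
               rng.normal(0, 1, (40, 3))])
y = np.array([1] * 40 + [0] * 40)
members = fit_ensemble(X, y)
acc = (predict_ensemble(members, X) == y).mean()
```

In the full method, the projection step would be any of the supervised projection algorithms evaluated in the paper rather than a single Fisher direction, and the misclassified set would first be enlarged by SMOTE-style oversampling when it is too small to estimate the projection reliably.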