Class-switching neural network ensembles
Neurocomputing
This article investigates the properties of ensembles of neural networks in which each network is trained on a perturbed version of the training data. The perturbation consists of switching the class labels of a randomly selected subset of the training examples. Experiments on several UCI and synthetic datasets show that these class-switching ensembles can improve classification accuracy over both individual networks and bagging ensembles.
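The class-switching idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact protocol: it uses scikit-learn's `MLPClassifier` as the base network, and the switch rate, network size, and ensemble size are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def class_switching_ensemble(X, y, n_estimators=11, switch_rate=0.2, seed=0):
    """Train an ensemble in which each member sees a perturbed copy of the
    labels: a random fraction `switch_rate` of examples has its class label
    switched to a different, randomly chosen class."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    members = []
    for _ in range(n_estimators):
        y_pert = y.copy()
        flip = rng.random(len(y)) < switch_rate
        for i in np.flatnonzero(flip):
            # switch to a class other than the original one
            others = classes[classes != y[i]]
            y_pert[i] = rng.choice(others)
        net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300,
                            random_state=int(rng.integers(1_000_000)))
        net.fit(X, y_pert)  # each member trains on the full, perturbed set
        members.append(net)
    return members

def ensemble_predict(members, X):
    # combine members by unweighted majority vote
    votes = np.stack([m.predict(X) for m in members])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```

Because each member is trained on independently perturbed labels, the members disagree on the switched examples, and the majority vote tends to recover the original labels when the switch rate is moderate.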