Soft combination of neural classifiers: a comparative study
Pattern Recognition Letters
Using Diversity with Three Variants of Boosting: Aggressive, Conservative, and Inverse
MCS '02 Proceedings of the Third International Workshop on Multiple Classifier Systems
Decision Fusion on Boosting Ensembles
ANNPR '08 Proceedings of the 3rd IAPR Workshop on Artificial Neural Networks in Pattern Recognition
Researching on Multi-net Systems Based on Stacked Generalization
ANNPR '08 Proceedings of the 3rd IAPR Workshop on Artificial Neural Networks in Pattern Recognition
New Results on Combination Methods for Boosting Ensembles
ICANN '08 Proceedings of the 18th International Conference on Artificial Neural Networks, Part I
Stacking MF networks to combine the outputs provided by RBF networks
ICANN '07 Proceedings of the 17th International Conference on Artificial Neural Networks
The two key factors in designing an ensemble of neural networks are how to train the individual networks and how to combine their different outputs into a single output. In this paper we focus on the combination module. We have proposed two methods based on Stacked Generalization as the combination module of an ensemble of neural networks. Here we compare the two versions of Stacked Generalization against six statistical combination methods in order to identify the best combiner. The comparison uses two measures: the mean increase of performance and the mean percentage of error reduction. The results show that the methods based on Stacked Generalization outperform the classical combiners.
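To make the idea of a combination module concrete, the sketch below contrasts a classical combiner (plain averaging of the networks' outputs) with a Stacked-Generalization-style combiner, in which a level-1 model is trained on the level-0 outputs. This is an illustrative NumPy toy, not the paper's implementation: the "networks" are simulated as noisy probability outputs, and the level-1 combiner is a simple least-squares linear model standing in for the neural combiner studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class problem: ground-truth labels for 300 samples.
n = 300
y = rng.integers(0, 2, size=n)

# Three simulated level-0 "networks", each emitting a probability
# for class 1 with a different noise level (stand-ins for trained nets).
outputs = np.stack(
    [np.clip(y + rng.normal(0.0, s, size=n), 0.0, 1.0) for s in (0.3, 0.5, 0.7)],
    axis=1,
)  # shape (n, 3)

# Classical combiner: average the network outputs, then threshold.
avg_pred = (outputs.mean(axis=1) > 0.5).astype(int)

# Stacked Generalization (minimal sketch): fit a level-1 linear
# combiner on the level-0 outputs via least squares, so more
# reliable networks receive larger weights.
X = np.column_stack([outputs, np.ones(n)])  # add a bias column
w, *_ = np.linalg.lstsq(X, y, rcond=None)
stack_pred = (X @ w > 0.5).astype(int)

acc_avg = (avg_pred == y).mean()
acc_stack = (stack_pred == y).mean()
```

In practice the level-1 model would be trained on held-out level-0 outputs (not the same data the networks were trained on), and the paper's combiners are themselves neural networks rather than a linear fit.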