Soft combination of neural classifiers: a comparative study. Pattern Recognition Letters.
Boosting and other ensemble methods. Neural Computation.
Boosting with averaged weight vectors. MCS'03: Proceedings of the 4th International Conference on Multiple Classifier Systems.
Error bounds for aggressive and conservative AdaBoost. MCS'03: Proceedings of the 4th International Conference on Multiple Classifier Systems.
Evolutionary ensembles with negative correlation learning. IEEE Transactions on Evolutionary Computation.
As the bibliography shows, training an ensemble of networks is an effective way to improve performance over a single network. However, there are many methods for constructing the ensemble. In this paper we present new results from a comparison of twenty different methods. We trained ensembles of 3, 9, 20 and 40 networks to cover a wide range of ensemble sizes. The results show that the improvement gained by growing the ensemble beyond 9 networks depends on the method but is usually small. The best method for an ensemble of 3 networks is "Decorrelated", which adds a penalty term to the usual Backpropagation error function to decorrelate the outputs of the networks in the ensemble. For 9 and 20 networks the best method is conservative boosting, and for 40 networks the best method is Cels.
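The idea behind the "Decorrelated" method can be illustrated with a small sketch. This is not the paper's implementation: the network architecture, data, learning rate, and the penalty strength λ below are all our own assumptions. The sketch trains two single-hidden-layer networks on a toy regression task, where each network j minimises the usual squared error plus a decorrelation penalty of the form λ(d - f_i)(d - f_j) involving the other network i, so the two networks are pushed toward making different errors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, not from the paper)
X = rng.uniform(-1, 1, size=(64, 1))
d = np.sin(3 * X)  # target

def init_net(n_hidden=8):
    """One-hidden-layer tanh network with a linear output."""
    return {
        "W1": rng.normal(0, 0.5, (1, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.5, (n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(net, X):
    h = np.tanh(X @ net["W1"] + net["b1"])
    return h, h @ net["W2"] + net["b2"]

def backward(net, X, h, delta_out, lr):
    """Gradient step given dLoss/d(output) in delta_out, shape (N, 1)."""
    gW2 = h.T @ delta_out
    gb2 = delta_out.sum(0)
    dh = (delta_out @ net["W2"].T) * (1 - h ** 2)  # tanh derivative
    gW1 = X.T @ dh
    gb1 = dh.sum(0)
    for k, g in zip(("W1", "b1", "W2", "b2"), (gW1, gb1, gW2, gb2)):
        net[k] -= lr * g / len(X)

nets = [init_net(), init_net()]
lam = 0.5  # decorrelation strength (assumed value)

def ensemble_mse():
    out = np.mean([forward(n, X)[1] for n in nets], axis=0)
    return float(np.mean((d - out) ** 2))

mse_before = ensemble_mse()
for epoch in range(500):
    hs, fs = zip(*(forward(n, X) for n in nets))
    for j, net in enumerate(nets):
        i = 1 - j  # the "other" network in this 2-net sketch
        # E_j = (d - f_j)^2 + lam * (d - f_i) * (d - f_j)
        # dE_j/df_j = -2 (d - f_j) - lam * (d - f_i)
        delta = -2 * (d - fs[j]) - lam * (d - fs[i])
        backward(net, X, hs[j], delta, lr=0.1)
mse_after = ensemble_mse()
```

With λ = 0 this reduces to training the networks independently by plain Backpropagation; a positive λ penalises networks whose residuals point in the same direction, which is the decorrelation effect the abstract refers to.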