Soft combination of neural classifiers: a comparative study
Pattern Recognition Letters
Training an ensemble of networks is an interesting way to improve performance with respect to a single network. However, there are several methods for constructing the ensemble, and no comprehensive results show which is the most appropriate. In this paper we present a comparison of eleven different methods. We trained ensembles with a reduced number of networks (3 and 9), because in this case the computational cost is low and the method is suitable for practical applications. The results show that the improvement in performance from three to nine networks is marginal. The best method, called "Decorrelated", adds a penalty term to the usual Back-propagation error function to decorrelate the network outputs in the ensemble.
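To illustrate the idea behind the "Decorrelated" method, the sketch below shows one plausible form of such a training objective: the usual squared error for a network, plus a penalty that grows when its errors are positively correlated with those of another network in the ensemble. The exact penalty form, the weight `lam`, and the pairing with the "previous" network are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def decorrelated_loss(y, f_i, f_prev, lam=0.5):
    """Illustrative ensemble loss for one network.

    y      : target values
    f_i    : outputs of the network being trained
    f_prev : outputs of another (already trained) network in the ensemble
    lam    : penalty weight (assumed hyperparameter)

    The first term is the usual squared error minimized by
    Back-propagation; the second term penalizes error correlation
    between the two networks, pushing their mistakes apart.
    """
    mse = np.mean((y - f_i) ** 2)
    # Positive when both networks err in the same direction on the
    # same examples; minimizing it decorrelates their outputs.
    penalty = np.mean((y - f_prev) * (y - f_i))
    return mse + lam * penalty

# Example: two networks whose errors point the same way incur a
# larger loss than the squared error alone.
y = np.array([1.0, 0.0])
f_prev = np.array([0.8, 0.2])
f_i = np.array([0.6, 0.4])
print(decorrelated_loss(y, f_i, f_prev))  # → 0.2 (0.16 mse + 0.5 * 0.08 penalty)
```

In a full training loop this loss would replace the plain squared error in the gradient computation for each ensemble member, so the networks learn complementary rather than redundant error patterns.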