When an ensemble of neural networks is designed, it is necessary to provide enough diversity among the networks in the ensemble. In this paper we propose a new method of providing diversity, which consists of reordering the training-set patterns during training. Three different algorithms are applied in the process of building ensembles with Simple Ensemble and Cross-Validation: the first uses the original training set, the second reorders the training set once before the training algorithm is applied, and the third reorders the training set at the beginning of each iteration of Backpropagation. With the proposed experiments we aim to demonstrate empirically that reordering patterns during training is a valid source of diversity for the networks of an ensemble. The results show that the performance of the original ensemble methods can be improved by reordering the patterns during training. Moreover, this new source of diversity can be extended to more complex ensemble methods.
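The three orderings described above can be sketched as index generators that a training loop would iterate over each epoch. This is a minimal illustration, not the authors' implementation; the function and mode names are hypothetical:

```python
import numpy as np

def training_order(n_patterns, n_epochs, mode, seed=0):
    """Yield the pattern presentation order for each epoch under the
    three strategies from the paper (mode names are hypothetical):
      "original"           -- original training-set order every epoch
      "reorder_once"       -- one random permutation, reused every epoch
      "reorder_each_epoch" -- a fresh random permutation at each epoch
    """
    rng = np.random.default_rng(seed)
    base = np.arange(n_patterns)
    if mode == "original":
        for _ in range(n_epochs):
            yield base.copy()
    elif mode == "reorder_once":
        once = rng.permutation(n_patterns)
        for _ in range(n_epochs):
            yield once.copy()
    elif mode == "reorder_each_epoch":
        for _ in range(n_epochs):
            yield rng.permutation(n_patterns)
    else:
        raise ValueError(f"unknown mode: {mode!r}")
```

When each network of the ensemble is trained with a different seed, the per-epoch reordering gives every network a distinct sequence of weight updates, which is the source of diversity the paper exploits.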