Neural network ensembles (NNEs) involve two aspects: how to generate the component networks and how to combine them. These two interdependent aspects jointly determine ensemble performance, yet almost all previous work has studied them separately. This paper proposes an integrated neural network ensemble (InNNE), an algorithm that both generates component networks based on clustering and dynamically adjusts the ensemble weights. InNNE partitions the training set into subsets by clustering and trains a different component network on each subset. The ensemble weights are then adjusted according to the correlation between an input and the centers of the training subsets. In this way, InNNE increases the diversity of the component networks and reduces the generalization error of the ensemble. The paper provides both analytical and experimental evidence in support of the algorithm.
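The scheme the abstract describes can be sketched in a few steps: cluster the training set, fit one component learner per cluster, and weight each component at prediction time by how close the input lies to that cluster's center. The sketch below is a minimal illustration under stated assumptions, not the paper's actual algorithm: it uses a toy 1-D regression task, a hand-rolled k-means, polynomial least-squares fits as hypothetical stand-ins for the component NNs, and an assumed `exp(-beta * distance)` proximity gate in place of the paper's correlation-based weighting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

# --- Step 1: partition the training set by clustering (simple k-means) ---
def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        # Assign each point to its nearest center.
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

k = 4
centers, labels = kmeans(X, k)

# --- Step 2: train one component learner per training subset ---
# Hypothetical stand-in for a component NN: cubic least-squares fit.
def fit_component(Xs, ys, degree=3):
    A = np.vander(Xs[:, 0], degree + 1)  # columns x^3, x^2, x, 1
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

components = [fit_component(X[labels == j], y[labels == j]) for j in range(k)]

# --- Step 3: dynamic ensemble weights from input-to-center proximity ---
def predict(x_scalar, beta=5.0):
    x = np.array([[x_scalar]])
    d = np.linalg.norm(x - centers, axis=1)   # distance to each cluster center
    w = np.exp(-beta * d)                     # closer cluster -> larger weight
    w /= w.sum()                              # normalize weights to sum to 1
    preds = np.array([np.polyval(c, x_scalar) for c in components])
    return float(w @ preds)                   # weighted combination
```

Because each component sees a different subset, the components disagree away from their own clusters; the proximity gate ensures an input is answered mostly by the components trained near it, which is the diversity-plus-dynamic-weighting idea the abstract attributes to InNNE.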