Evolutionary computation: toward a new philosophy of machine intelligence
In practice, two criteria are commonly used to stop the evolutionary process when evolving neural network (NN) ensembles. The first stops the evolution when the maximal number of generations is reached. The second stops when the evolved NN ensemble, i.e., the whole population, is satisfactory according to a certain evaluation criterion. This paper points out that NN ensembles evolved under these two criteria may not be robust, since they can exhibit different performance. To make the evolved NN ensemble more stable, an alternative is to combine a number of evolved NN ensembles. Experimental analyses based on n-fold cross-validation are given to explain why the evolved NN ensembles can differ substantially and how such differences can disappear or be reduced through combination.
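The combination of several evolved ensembles mentioned above can be sketched, for a classification task, as a majority vote over their class predictions. This is a minimal illustration under assumed discrete class outputs; the `majority_vote` helper is hypothetical and not necessarily the paper's exact combination scheme.

```python
import numpy as np

def majority_vote(predictions):
    """Combine class predictions from several evolved NN ensembles
    by per-sample majority vote (hypothetical helper for illustration)."""
    predictions = np.asarray(predictions)  # shape: (n_ensembles, n_samples)
    combined = []
    for sample_preds in predictions.T:  # iterate over samples
        values, counts = np.unique(sample_preds, return_counts=True)
        combined.append(values[np.argmax(counts)])  # most frequent class wins
    return np.array(combined)

# Three evolved ensembles disagree on individual samples,
# but the combined prediction smooths out those differences.
ens1 = [0, 1, 1, 0]
ens2 = [0, 1, 0, 0]
ens3 = [1, 1, 1, 0]
print(majority_vote([ens1, ens2, ens3]).tolist())  # -> [0, 1, 1, 0]
```

The vote averages away disagreements between individually evolved ensembles, which is one way the performance differences noted in the abstract can be reduced in the combination.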