Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms
Ensemble learning via negative correlation
Neural Networks
Neural Networks: A Comprehensive Foundation
Numerical Optimization of Computer Models
Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems
Ensembling neural networks: many could be better than all
Artificial Intelligence
Genetic algorithm based selective neural network ensemble
IJCAI'01 Proceedings of the 17th international joint conference on Artificial intelligence - Volume 2
Evolutionary ensembles with negative correlation learning
IEEE Transactions on Evolutionary Computation
Making use of population information in evolutionary artificial neural networks
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
A new evolutionary system for evolving artificial neural networks
IEEE Transactions on Neural Networks
The formation of ensembles of artificial neural networks has attracted the attention of researchers in the machine learning and statistical inference domains, and it has been shown that combining different neural networks can improve the generalization ability of the learning machine. One challenge is deciding when to stop the training or evolution of the neural networks to avoid overfitting. In the literature on ensembles of Evolutionary Artificial Neural Networks (EANNs), researchers often form the ensemble from the surviving population at the last generation. In this paper, we show that ensembles constructed from the populations selected by two early stopping criteria, (i) the minimum validation fitness of the ensemble and (ii) the minimum average validation fitness of the population, can generalize better than the ensemble formed from the population of the last generation. The proposition was tested on ensembles whose members are differentiated by two diversity mechanisms: (i) negative correlation learning (NCL) and (ii) the island model. The experimental results suggest that using the minimum validation fitness of the ensemble as an early stopping criterion performs significantly better (with 99% confidence) than using the population of the last generation on three (with NCL) and four (with the island model) of the five datasets.
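The two early stopping criteria described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the evolutionary step and the fitness functions are stand-in parameters supplied by the caller, and the function names are assumptions for the sake of the example.

```python
def evolve_with_early_stopping(population, evolve_step,
                               ensemble_val_error, member_val_error,
                               generations=50):
    """Run an evolutionary loop and remember the population snapshot that
    minimises (i) the validation error of the whole ensemble and (ii) the
    average validation error of the individual members.

    All callables here are placeholders: `evolve_step` produces the next
    population, `ensemble_val_error` scores the combined ensemble on a
    validation set, and `member_val_error` scores one member.
    """
    best = {
        "ensemble": (float("inf"), None),    # criterion (i)
        "avg_member": (float("inf"), None),  # criterion (ii)
    }
    for _ in range(generations):
        population = evolve_step(population)
        e = ensemble_val_error(population)
        a = sum(member_val_error(m) for m in population) / len(population)
        if e < best["ensemble"][0]:
            best["ensemble"] = (e, list(population))
        if a < best["avg_member"][0]:
            best["avg_member"] = (a, list(population))
    # The baseline the paper compares against is simply the final population.
    return best, population
```

The key design point is that the loop never halts early; it records the best snapshot under each criterion and returns it alongside the last-generation population, so the three candidate ensembles can be compared on a held-out test set afterwards.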