The neural-network ensemble (NNE) is an effective method in which the outputs of separately trained neural networks are combined to form a prediction. In this paper, we introduce an improved neural-network ensemble (INNE) in which each component feed-forward neural network (FNN) is optimized by particle swarm optimization (PSO) and the back-propagation (BP) algorithm, while the ensemble weights are trained by a cooperative particle swarm optimization and differential evolution algorithm (PSO-DE). The algorithm maintains two distinct populations, one evolved by PSO and the other by DE. In addition, we incorporate the fitness value from the previous iteration into the velocity update to enhance global search ability. Our experiments demonstrate that the improved NNE outperforms existing popular NNE methods.
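As a rough illustration of the two-population cooperation described above, the following sketch optimizes ensemble weights for three toy component networks: one population follows a standard inertia-weight PSO update, the other a DE/rand/1/bin update, and the populations cooperate only through a shared global best. All names, population sizes, coefficients, and the MSE fitness are illustrative assumptions, and the paper's fitness-from-last-iteration velocity term is not reproduced since its exact form is not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the outputs of 3 separately trained component FNNs
# on 50 samples (the real method trains each component with PSO + BP).
preds = rng.normal(size=(3, 50))
target = 0.5 * preds[0] + 0.3 * preds[1] + 0.2 * preds[2]

def fitness(w):
    """MSE of the weighted ensemble prediction (lower is better)."""
    w = np.abs(w) / (np.abs(w).sum() + 1e-12)  # normalize to a convex combination
    return np.mean((w @ preds - target) ** 2)

D, N, iters = 3, 10, 100                       # dimensions, pop size, iterations
pso_pop = rng.uniform(0, 1, (N, D))            # population evolved by PSO
vel = np.zeros((N, D))
de_pop = rng.uniform(0, 1, (N, D))             # population evolved by DE

pbest = pso_pop.copy()
pbest_f = np.array([fitness(p) for p in pbest])
de_f = np.array([fitness(p) for p in de_pop])

def global_best():
    """Best individual across both cooperating populations."""
    cand = np.vstack([pbest, de_pop])
    f = np.concatenate([pbest_f, de_f])
    return cand[f.argmin()].copy(), f.min()

g, gf = global_best()
for _ in range(iters):
    # PSO population: inertia-weight velocity update toward pbest and the
    # shared global best g (the only channel of cooperation).
    r1, r2 = rng.uniform(size=(2, N, D))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pso_pop) + 1.5 * r2 * (g - pso_pop)
    pso_pop = pso_pop + vel
    f = np.array([fitness(p) for p in pso_pop])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pso_pop[improved], f[improved]

    # DE population: DE/rand/1 mutation plus binomial crossover, greedy selection.
    for i in range(N):
        a, b, c = rng.choice([j for j in range(N) if j != i], 3, replace=False)
        mutant = de_pop[a] + 0.5 * (de_pop[b] - de_pop[c])
        trial = np.where(rng.uniform(size=D) < 0.9, mutant, de_pop[i])
        tf = fitness(trial)
        if tf < de_f[i]:
            de_pop[i], de_f[i] = trial, tf

    g, gf = global_best()

w = np.abs(g) / np.abs(g).sum()
print("best ensemble MSE:", gf)
print("ensemble weights:", w)
```

On this toy problem the target is an exact convex combination of the component outputs, so the cooperative search should drive the ensemble MSE well below that of, say, uniform weights; how closely this mirrors the paper's PSO-DE is an assumption.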