Differential Evolution Training Algorithm for Feed-Forward Neural Networks
Neural Processing Letters
One of the main problems in training artificial neural networks is defining their initial weights and architecture. Evolutionary algorithms (EAs) have been widely used to optimize artificial neural networks because they can handle large, non-differentiable, complex, and multimodal search spaces, and because they are effective at locating the region of the optimum. In this paper we propose the use of Adaptive Differential Evolution (JADE), a recent evolutionary algorithm based on differential evolution (DE), to train neural networks. Experiments on machine learning classification benchmarks were performed to evaluate the proposed method.
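The approach described in the abstract can be sketched as follows: each candidate in a DE population is a flat vector of network weights, fitness is the classification loss, and JADE's signature ingredients are the DE/current-to-pbest/1 mutation plus self-adaptation of the means of F and CR from successful trials. The code below is a minimal illustrative sketch, not the authors' implementation; the toy dataset, the tiny 2-4-1 network, and all hyperparameter values (population size, p-best fraction, adaptation rate C) are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs (illustrative assumption,
# not one of the paper's benchmarks).
X = np.vstack([rng.normal(-1, 0.5, (40, 2)), rng.normal(1, 0.5, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

H = 4                     # hidden units in a 2-H-1 feed-forward network
DIM = 2 * H + H + H + 1   # total number of weights and biases

def forward(w, X):
    """Decode a flat weight vector and run the network."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def loss(w):
    """Cross-entropy loss used as the DE fitness function."""
    p = np.clip(forward(w, X), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

POP, P, C = 30, 0.2, 0.1   # population size, p-best fraction, adaptation rate
pop = rng.normal(0, 1, (POP, DIM))
fit = np.array([loss(w) for w in pop])
mu_F, mu_CR = 0.5, 0.5     # adaptive means: JADE's core self-adaptation state

for gen in range(200):
    SF, SCR = [], []                       # successful F and CR values
    order = np.argsort(fit)
    pbest = pop[order[:max(1, int(P * POP))]]
    for i in range(POP):
        # Per-individual parameters drawn around the adaptive means
        F = np.clip(rng.standard_cauchy() * 0.1 + mu_F, 0.1, 1.0)
        CR = np.clip(rng.normal(mu_CR, 0.1), 0.0, 1.0)
        r1, r2 = rng.choice(POP, 2, replace=False)
        xp = pbest[rng.integers(len(pbest))]
        # DE/current-to-pbest/1 mutation (the JADE mutation strategy)
        v = pop[i] + F * (xp - pop[i]) + F * (pop[r1] - pop[r2])
        # Binomial crossover with one guaranteed mutant component
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True
        u = np.where(cross, v, pop[i])
        fu = loss(u)
        if fu < fit[i]:                    # greedy selection
            pop[i], fit[i] = u, fu
            SF.append(F)
            SCR.append(CR)
    if SF:
        # Update adaptive means from successful trials
        # (Lehmer mean for F, arithmetic mean for CR, as in JADE)
        mu_F = (1 - C) * mu_F + C * (np.sum(np.square(SF)) / np.sum(SF))
        mu_CR = (1 - C) * mu_CR + C * np.mean(SCR)

best = pop[np.argmin(fit)]
acc = np.mean((forward(best, X) > 0.5) == y)
```

This sketch omits JADE's optional external archive of inferior solutions and evolves only the weights of a fixed architecture; the paper's method also concerns the choice of architecture.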