The use of artificial neural networks involves considerable time spent choosing a set of parameters that improve the final performance. The initial weights, the number of hidden nodes and layers, the training-algorithm rates and the transfer functions are normally selected through a manual trial-and-error process that often fails to find the best possible set of neural network parameters for a given problem. This paper proposes an automatic search methodology for optimizing the parameters and performance of neural networks. A hybrid global-search module relies on Evolution Strategies, Particle Swarm Optimization and concepts from Genetic Algorithms, while a local-search module employs the well-known Multilayer Perceptron trained with the Back-propagation and Levenberg-Marquardt algorithms. The methodology searches over the aforementioned parameters in an attempt to optimize the networks' structure and performance. Experiments show that the proposed method outperforms trial-and-error and other methods reported in the literature.
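To make the global-search idea concrete, the following is a minimal sketch of Particle Swarm Optimization over two hypothetical network parameters (number of hidden nodes and learning rate). It is an assumption-level illustration, not the paper's actual hybrid method: the function `validation_error` is a smooth stand-in for the validation error a trained network would report, and the bounds and PSO constants are illustrative choices.

```python
import random

def validation_error(hidden_nodes, learning_rate):
    # Hypothetical surrogate for a network's validation error (assumption):
    # lowest around 20 hidden nodes and a learning rate of 0.1. In the real
    # methodology this would be replaced by training and evaluating a network.
    return (hidden_nodes - 20.0) ** 2 / 400.0 + (learning_rate - 0.1) ** 2 * 50.0

def pso(n_particles=20, iterations=60, seed=0):
    rng = random.Random(seed)
    bounds = [(2.0, 50.0), (0.001, 1.0)]  # (hidden nodes, learning rate)
    # Initialize particle positions uniformly inside the bounds, zero velocity.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # per-particle best positions
    pbest_f = [validation_error(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]          # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                         # inertia and acceleration weights
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(2):
                # Standard PSO velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = validation_error(*pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best_params, best_err = pso()
```

In the full methodology the parameters found by the global search would then be handed to a local-search stage (e.g. Back-propagation or Levenberg-Marquardt training) for refinement.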