In this paper, we propose a novel technique for the automatic design of Artificial Neural Networks (ANNs) by evolving toward the optimal network configuration(s) within an architecture space. The technique is based entirely on a multi-dimensional Particle Swarm Optimization (MD PSO) method, which re-forms the native structure of swarm particles so that they can make inter-dimensional passes with a dedicated dimensional PSO process. In a multi-dimensional search space where the optimum dimension is unknown, swarm particles can therefore seek both positional and dimensional optima. This removes the need to fix the dimension a priori, which is a common drawback of the family of swarm optimizers. With a proper encoding of network configurations and parameters into particles, MD PSO can then seek the positional optimum in the error space and the dimensional optimum in the architecture space. The optimum dimension converged upon at the end of an MD PSO process corresponds to a unique ANN configuration, from which the network parameters (connections, weights, and biases) can be resolved from the positional optimum reached in that dimension. In addition, the proposed technique generates a ranked list of network configurations, from best to worst. This is indeed crucial information, indicating which configurations can serve as alternatives to the best one and which configurations should not be used at all for a particular problem. In this study, the architecture space is defined over feed-forward, fully-connected ANNs, so that conventional techniques such as back-propagation, as well as several other evolutionary methods in this field, can be applied for comparison. The proposed technique is applied to highly challenging synthetic problems to test its optimality in evolving networks, and to benchmark problems to test its generalization capability and to make comparative evaluations against several competing techniques.
The experimental results show that MD PSO generally evolves to optimum or near-optimum networks and has a superior generalization capability. Furthermore, MD PSO naturally favors a low-dimensional solution when it performs competitively with a higher-dimensional counterpart; this native tendency steers the evolution process toward compact network configurations in the architecture space rather than complex ones, as long as optimality prevails.
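The core idea of the abstract — particles that carry a position per candidate dimension plus a current dimension updated by its own PSO-style rule — can be sketched roughly as follows. This is an illustrative toy implementation, not the authors' method: the function name `mdpso`, all parameter values, and the test fitness (constructed so that dimension 3 is optimal) are assumptions, and the "architecture space" here is just an integer dimension range rather than encoded ANN configurations.

```python
import random

def mdpso(fitness, d_min, d_max, n_particles=20, iters=200,
          w=0.72, c1=1.49, c2=1.49, seed=1):
    """Minimal multi-dimensional PSO sketch: each particle keeps a position,
    velocity and personal best per candidate dimension, plus a current
    dimension that is itself moved by a PSO-style update."""
    rng = random.Random(seed)
    dims = list(range(d_min, d_max + 1))
    pos = [{d: [rng.uniform(-1, 1) for _ in range(d)] for d in dims}
           for _ in range(n_particles)]
    vel = [{d: [0.0] * d for d in dims} for _ in range(n_particles)]
    pbest = [{d: list(pos[i][d]) for d in dims} for i in range(n_particles)]
    pbest_f = [{d: fitness(pos[i][d]) for d in dims} for i in range(n_particles)]
    cur_d = [rng.choice(dims) for _ in range(n_particles)]
    dvel = [0.0] * n_particles
    pbest_d = list(cur_d)
    # one global best per dimension, plus the best dimension found so far
    gbest_f = {d: min(pbest_f[i][d] for i in range(n_particles)) for d in dims}
    gbest = {d: list(min((pbest[i][d] for i in range(n_particles)), key=fitness))
             for d in dims}
    dbest = min(dims, key=lambda d: gbest_f[d])
    for _ in range(iters):
        for i in range(n_particles):
            d = cur_d[i]
            # positional update within the particle's current dimension
            for j in range(d):
                r1, r2 = rng.random(), rng.random()
                vel[i][d][j] = (w * vel[i][d][j]
                                + c1 * r1 * (pbest[i][d][j] - pos[i][d][j])
                                + c2 * r2 * (gbest[d][j] - pos[i][d][j]))
                pos[i][d][j] += vel[i][d][j]
            f = fitness(pos[i][d])
            if f < pbest_f[i][d]:
                pbest_f[i][d], pbest[i][d] = f, list(pos[i][d])
                if f < gbest_f[d]:
                    gbest_f[d], gbest[d] = f, list(pos[i][d])
                    if f < gbest_f[dbest]:
                        dbest = d  # a new best dimension has emerged
            # dimensional update: drift toward personal and global best dimensions
            r1, r2 = rng.random(), rng.random()
            dvel[i] = (w * dvel[i] + c1 * r1 * (pbest_d[i] - d)
                       + c2 * r2 * (dbest - d))
            cur_d[i] = max(d_min, min(d_max, round(d + dvel[i])))
            if pbest_f[i][cur_d[i]] < pbest_f[i][pbest_d[i]]:
                pbest_d[i] = cur_d[i]
    return dbest, gbest[dbest], gbest_f[dbest]
```

In an ANN-evolution setting, `fitness` would decode a particle's position in dimension `d` into the weights and biases of the `d`-th configuration in the architecture space and return its training error; the converged `dbest` then names the winning configuration, matching the abstract's joint positional/dimensional search.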