In this paper, a novel evolutionary algorithm (EA) based on a newly formulated parameter, the growth probability (P_g), is used to evolve near-optimal weights and the number of hidden neurons in neural networks (NNs). Training NNs with growth-probability-based evolution (NN-GP) initializes each network with a single hidden neuron and allows it to grow to a suitable size. Growth is not restricted to one hidden neuron at a time, because the optimal number of hidden neurons may lie several neurons beyond the current size; when that solution is far away in the search space, a network must be able to add several hidden neurons in one step. The growth amount is drawn from a Gaussian distribution, which provides a way to escape local optima. A self-adaptive version (NN-SAGP), which evolves the growth probability in parallel with the NNs in each generation, is also proposed. The evolved networks are applied to widely used real-world benchmark problems. Simulation results show that the proposed approach is effective for evolving NNs with good classification accuracy and low complexity.
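The growth step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the standard deviation `sigma`, and the use of the absolute value of a zero-mean Gaussian draw are all assumptions made for the example; the paper specifies only that growth is triggered by the probability P_g and that the growth amount follows a Gaussian distribution.

```python
import random

def grow_hidden_neurons(num_hidden, growth_prob, sigma=2.0):
    """Sketch of growth-probability-based evolution of network size.

    With probability `growth_prob` (the paper's P_g), add a
    Gaussian-distributed number of hidden neurons (at least one),
    so the network can jump several neurons at once when the
    optimal size is far from the current size.

    `sigma` is a hypothetical parameter controlling the typical
    growth step; it is not specified in the abstract.
    """
    if random.random() < growth_prob:
        # |N(0, sigma)| rounded, floored at 1, so a triggered growth
        # step always adds at least one hidden neuron.
        delta = max(1, round(abs(random.gauss(0.0, sigma))))
        num_hidden += delta
    return num_hidden
```

In the self-adaptive variant (NN-SAGP), `growth_prob` itself would be carried as part of each individual and evolved alongside the network in every generation, rather than being fixed in advance.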