Evolving gradient-learning artificial neural networks (ANNs) with an evolutionary algorithm (EA) is a popular way to address the local-optima and architecture-design problems of ANNs. The typical approach combines the strength of backpropagation (BP) in weight learning with the EA's ability to search the architecture space. However, BP's gradient descent is computationally intensive, which restricts the EA's search coverage by forcing it to use a small population. To address this problem, we used a mutation-based genetic neural network (MGNN) that replaces BP with the locally adaptive mutation strategy of evolutionary programming (EP) for weight learning. MGNN's mutation enables a network to evolve its structure and adapt its weights at the same time. Moreover, MGNN's EP-based encoding scheme allows a flexible, less restricted formulation of the fitness function and makes fitness computation fast and efficient. This makes larger population sizes feasible and gives MGNN relatively wide coverage of the architecture space. MGNN implements a stopping criterion that monitors overfitting through "sliding windows" to avoid both premature learning and overlearning. Statistical analysis of its performance on several well-known classification problems demonstrates its good generalization capability. It also reveals that locally adapting or scheduling the strategy parameters embedded in each individual network may provide a proper balance between the local and global search capabilities of MGNN.
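The two mechanisms the abstract relies on can be illustrated with a minimal sketch: EP-style self-adaptive mutation, where each weight carries its own strategy parameter (step size) that is mutated alongside it, and a sliding-window check on validation error for the stopping criterion. The function names, the log-normal step-size update, and the window comparison below are illustrative assumptions, not MGNN's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def ep_mutate(weights, sigmas):
    """Self-adaptive EP mutation: each weight has its own strategy
    parameter (step size), which is itself mutated first (log-normal
    update, a common EP/ES convention assumed here), then used to
    perturb the weight."""
    n = weights.size
    tau = 1.0 / np.sqrt(2.0 * np.sqrt(n))        # per-gene learning rate
    tau_prime = 1.0 / np.sqrt(2.0 * n)           # global learning rate
    # Locally adapt the strategy parameters.
    new_sigmas = sigmas * np.exp(
        tau_prime * rng.standard_normal() + tau * rng.standard_normal(n)
    )
    # Perturb weights with the updated, locally adapted step sizes.
    new_weights = weights + new_sigmas * rng.standard_normal(n)
    return new_weights, new_sigmas

def overfitting_detected(val_errors, window=5):
    """Sliding-window stopping check (illustrative): flag overfitting
    when the mean validation error over the most recent window exceeds
    the mean over the window before it."""
    if len(val_errors) < 2 * window:
        return False
    recent = np.mean(val_errors[-window:])
    earlier = np.mean(val_errors[-2 * window:-window])
    return bool(recent > earlier)
```

In an EP loop, each individual would store its `(weights, sigmas)` pair and be replaced by its mutated offspring when the offspring's fitness is better, with `overfitting_detected` consulted each generation to decide whether to stop.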