A major limitation of current artificial neural network (NN) research is the inability to adequately identify unnecessary weights in the solution. A method that identified such weights would give decision-makers crucial information about the problem at hand and would also yield a more effective and efficient network. The Neural Network Simultaneous Optimization Algorithm (NNSOA) is proposed for supervised training of multilayer feedforward neural networks. We demonstrate with Monte Carlo studies that the NNSOA can obtain a global solution while simultaneously identifying a parsimonious network structure.
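To make the idea concrete, the following is a minimal, hedged sketch of the general approach the abstract describes: evolving all weights of a small feedforward network with a fitness function that penalizes active (nonzero) weights, so that unnecessary connections are driven toward zero. This is an illustrative toy in the spirit of the NNSOA, not the authors' algorithm; the network size, penalty term, mutation scheme, and all parameter values are assumptions chosen for the example.

```python
import math
import random

# Illustrative sketch only: a simple evolutionary trainer that fits a tiny
# feedforward net while penalizing active weights, so unneeded weights are
# pushed to (near) zero. All constants below are assumed, not from the paper.
random.seed(0)

N_HID = 3
N_W = 3 * N_HID + 1      # per hidden unit: input weight + bias; plus output weights + output bias
PENALTY = 0.01           # assumed parsimony penalty per active weight
EPS = 1e-3               # weights below this magnitude count as "pruned"

# Toy supervised task: approximate y = 2x on [-1, 1].
DATA = [(x / 10.0, 2.0 * (x / 10.0)) for x in range(-10, 11)]

def forward(w, x):
    """One input -> N_HID tanh hidden units -> linear output."""
    out = w[-1]                                   # output bias
    for h in range(N_HID):
        a = math.tanh(w[2 * h] * x + w[2 * h + 1])  # hidden activation
        out += w[2 * N_HID + h] * a                 # hidden-to-output weight
    return out

def fitness(w):
    """Sum-squared error plus a penalty for every non-pruned weight."""
    sse = sum((forward(w, x) - y) ** 2 for x, y in DATA)
    active = sum(1 for v in w if abs(v) > EPS)
    return sse + PENALTY * active

def mutate(w):
    """Perturb one weight, or occasionally zero it out to attempt pruning."""
    w = list(w)
    i = random.randrange(N_W)
    if random.random() < 0.2:
        w[i] = 0.0
    else:
        w[i] += random.gauss(0.0, 0.3)
    return w

# Simple (mu + lambda)-style loop with elitism.
pop = [[random.uniform(-1, 1) for _ in range(N_W)] for _ in range(30)]
for _ in range(300):
    pop.sort(key=fitness)
    pop = pop[:15] + [mutate(random.choice(pop[:15])) for _ in range(15)]

best = min(pop, key=fitness)
sse = sum((forward(best, x) - y) ** 2 for x, y in DATA)
pruned = sum(1 for v in best if abs(v) <= EPS)
print(f"SSE={sse:.4f}, pruned weights={pruned}/{N_W}")
```

Because fit quality and parsimony are combined in a single fitness value, a single evolutionary search can pursue both at once, which is the "simultaneous optimization" notion the abstract refers to.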