The performance of feed-forward neural networks can be substantially impaired by ill-conditioning of the corresponding Jacobian matrix. Ill-conditioning in the feed-forward learning process is related to the properties of the activation function used. It is shown that network training performance can be improved by using an adaptive activation function whose gain parameter is updated appropriately during learning. The efficiency of the proposed adaptive procedure is examined on structural optimization problems, in which a trained neural network replaces the structural analysis phase and supplies the optimizer with the necessary data. The optimizer used in this study is an algorithm based on evolution strategies.
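The idea of an adaptive activation function can be illustrated with a minimal sketch: a sigmoid with a trainable gain g, sigmoid(z; g) = 1 / (1 + exp(-g*z)), where g is updated by gradient descent alongside the weights. This is only an illustrative reconstruction of the general technique; the network architecture, the toy XOR data, and the plain gradient-descent update rule below are assumptions, not the paper's actual procedure.

```python
import numpy as np

# Hedged sketch: one-hidden-layer network whose hidden units use a
# sigmoid with a trainable gain g. The gain is adapted during learning
# by gradient descent, in the spirit of the adaptive activation
# function described in the abstract.

rng = np.random.default_rng(0)

def sigmoid(z, g):
    """Sigmoid with gain parameter g."""
    return 1.0 / (1.0 + np.exp(-g * z))

# XOR toy problem (illustrative data, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)
g = 1.0   # hidden-layer gain, adapted below
lr = 0.5

def mse_loss():
    h = sigmoid(X @ W1 + b1, g)
    out = sigmoid(h @ W2 + b2, 1.0)
    return float(np.mean((out - y) ** 2))

loss_before = mse_loss()
for _ in range(5000):
    # Forward pass.
    z1 = X @ W1 + b1
    h = sigmoid(z1, g)
    z2 = h @ W2 + b2
    out = sigmoid(z2, 1.0)

    # Backprop through the mean-squared error.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    # Common factor s*(1-s) of the gained sigmoid's derivatives:
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ (d_h * g)          # d sigmoid / d z1 = g * s * (1 - s)
    db1 = (d_h * g).sum(axis=0)
    dg = float((d_h * z1).sum())   # d sigmoid / d g = z1 * s * (1 - s)

    # Update weights and the activation gain together.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    g -= lr * dg

loss_after = mse_loss()
```

Updating g effectively rescales the slope of the activation, which changes the magnitudes of the entries of the network's Jacobian; this is one simple mechanism by which an adaptive gain can mitigate the ill-conditioning the abstract refers to.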