The major drawbacks of the backpropagation algorithm are entrapment in local minima and slow convergence. This paper presents an efficient technique, ANMBP, for training single-hidden-layer neural networks that improves convergence speed and helps escape local minima. The algorithm modifies backpropagation in a neighborhood-based neural network by replacing fixed learning parameters with adaptive ones. The learning algorithm is applied to several problems, and on all of them the proposed algorithm performs well.
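The abstract does not spell out ANMBP's update rules, but the general idea it builds on, replacing a fixed learning rate in backpropagation with an adaptive one, can be sketched. The following is a minimal illustration only, using a simple "bold driver" heuristic (grow the rate when error falls, shrink it when error rises) on a single-hidden-layer network trained on XOR; the network size, heuristic, and constants are assumptions, not the paper's method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(epochs=5000, seed=0):
    """Backpropagation with an adaptive learning rate (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(scale=1.0, size=(2, 4))  # input -> hidden weights
    b1 = np.zeros(4)
    W2 = rng.normal(scale=1.0, size=(4, 1))  # hidden -> output weights
    b2 = np.zeros(1)

    lr = 0.5                 # initial learning rate (assumed constant)
    prev_err = np.inf
    first_err = None
    for _ in range(epochs):
        # forward pass through the single hidden layer
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        err = float(np.mean((out - y) ** 2))
        if first_err is None:
            first_err = err

        # adaptive learning rate: grow on improvement, shrink otherwise
        lr = lr * 1.05 if err < prev_err else lr * 0.5
        prev_err = err

        # backward pass: gradients of mean squared error w.r.t. weights
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out) / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * (X.T @ d_h) / len(X)
        b1 -= lr * d_h.mean(axis=0)
    return first_err, err
```

Shrinking the rate on error increases is what lets the sketch recover from overly large steps, while growing it on improvement speeds traversal of flat regions; ANMBP's adaptive parameters presumably serve a similar purpose within its neighborhood-based formulation.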