Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
Improving the convergence of the back-propagation algorithm
Neural Networks
Principles of Neurocomputing for Science and Engineering
Neural Networks: Tricks of the Trade (an outgrowth of a 1996 NIPS workshop)
A neural root finder of polynomials based on root moments
Neural Computation
The local minima-free condition of feedforward neural networks for outer-supervised learning
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Improving the error backpropagation algorithm with a modified error function
IEEE Transactions on Neural Networks
Comments on “Accelerated learning algorithm for multilayer perceptrons: optimization layer by layer”
IEEE Transactions on Neural Networks
A new error function at hidden layers for fast training of multilayer perceptrons
IEEE Transactions on Neural Networks
A constructive approach for finding arbitrary roots of polynomials by neural networks
IEEE Transactions on Neural Networks
Zeroing polynomials using modified constrained neural network approach
IEEE Transactions on Neural Networks
An accelerated learning algorithm for multilayer perceptrons: optimization layer by layer
IEEE Transactions on Neural Networks
This paper points out some drawbacks of the conventional layer-by-layer BP algorithm and proposes corresponding modifications. In particular, we present a new perspective on the learning rate: a heuristic rule is used to set the rate with which the weights are updated. Meanwhile, to pull the algorithm out of saturation regions and to keep it from converging to a local minimum, a momentum term is introduced into the original algorithm. Finally, the effectiveness and efficiency of the proposed method are demonstrated on two benchmark examples.
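The abstract names two ingredients: a heuristically adapted learning rate and a momentum term added to the weight update. The sketch below illustrates both on a small XOR network. It is a minimal illustration, not the paper's method: the abstract does not state the exact heuristic, so a bold-driver-style rule (grow the rate when the error falls, shrink it when the error rises) stands in for it, and all names and hyperparameter values (eta, mu, the 2-2-1 topology) are assumptions made for the example.

```python
import numpy as np

# Sketch of backprop with a heuristic learning rate and a momentum term,
# in the spirit of the modifications described in the abstract.  The exact
# heuristic is not given there; the "bold driver" rule below is a stand-in.
# All names and hyperparameters are illustrative, not from the paper.

rng = np.random.default_rng(0)

# Tiny 2-2-1 network on the XOR benchmark.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta, mu = 0.5, 0.9                     # learning rate, momentum coefficient
params = [W1, b1, W2, b2]
vel = [np.zeros_like(p) for p in params]
prev_err = np.inf

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = 0.5 * np.sum((out - y) ** 2)

    # Heuristic learning-rate rule (bold-driver stand-in): grow eta a
    # little on improvement, cut it sharply when the error goes up.
    eta = eta * 1.05 if err < prev_err else eta * 0.5
    prev_err = err

    # Backward pass (squared error, sigmoid units).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    grads = [X.T @ d_h, d_h.sum(0), h.T @ d_out, d_out.sum(0)]

    # Momentum update: the velocity carries the step through flat
    # saturation regions and past shallow local minima.
    for p, v, g in zip(params, vel, grads):
        v *= mu
        v -= eta * g
        p += v

print("final error:", prev_err)
```

The momentum update is what the abstract credits with pulling the algorithm out of the saturation area: when gradients are nearly zero, the accumulated velocity keeps the weights moving, and the adaptive rate recovers step size once the error starts falling again.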