A New BP Algorithm with Adaptive Momentum for FNNs Training
GCIS '09 Proceedings of the 2009 WRI Global Congress on Intelligent Systems - Volume 04
In this paper, the convergence of a new back-propagation algorithm with adaptive momentum is analyzed when it is used for training feedforward neural networks with one hidden layer. A convergence theorem is presented, and sufficient conditions are offered that guarantee both weak and strong convergence results. Compared with existing results, our convergence result is deterministic in nature, and we do not require the error function to be quadratic or uniformly convex.
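The abstract describes back-propagation with an adaptive momentum term for a one-hidden-layer feedforward network. The paper's exact adaptive rule is not given here, so the sketch below is a minimal illustration under an assumed (hypothetical) rule: the momentum term is suppressed whenever the previous weight update opposes the current negative-gradient direction. All hyperparameters (`eta`, `mu`, layer sizes) are illustrative choices, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bp_momentum(X, y, hidden=4, eta=0.5, mu=0.9, epochs=2000, seed=0):
    """Batch BP with a hypothetical adaptive momentum rule for a
    one-hidden-layer sigmoid network trained on squared error."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    dW1 = np.zeros_like(W1)
    dW2 = np.zeros_like(W2)
    for _ in range(epochs):
        # forward pass
        H = sigmoid(X @ W1)
        out = sigmoid(H @ W2)
        # backward pass for the squared-error function
        delta_out = (out - y) * out * (1 - out)
        g2 = H.T @ delta_out                       # gradient w.r.t. W2
        delta_hid = (delta_out @ W2.T) * H * (1 - H)
        g1 = X.T @ delta_hid                       # gradient w.r.t. W1
        # assumed adaptive rule: keep momentum only while the previous
        # update still points downhill w.r.t. the current gradient
        m2 = mu if np.sum(dW2 * -g2) >= 0 else 0.0
        m1 = mu if np.sum(dW1 * -g1) >= 0 else 0.0
        dW2 = -eta * g2 + m2 * dW2
        dW1 = -eta * g1 + m1 * dW1
        W2 += dW2
        W1 += dW1
    return W1, W2

# Usage example: the XOR problem, a standard test for one-hidden-layer BP.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_bp_momentum(X, y)
pred = sigmoid(sigmoid(X @ W1) @ W2)
mse = float(np.mean((pred - y) ** 2))
```

The zero-or-`mu` switch is only one of several common adaptive momentum heuristics; the paper's analysis applies to its own update rule, which may differ.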