A Neural Network Online Training Algorithm Based on Compound Gradient Vector
AI '02 Proceedings of the 15th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
This paper proposes an improved fast-convergent online training weight update scheme for neural networks (NNs) based on a compound gradient vector. The convergence analysis indicates that, because the compound gradient vector is employed during the weight update, the proposed algorithm converges faster than the back-propagation (BP) algorithm. The scheme also introduces an adaptive learning factor, which ensures global convergence and accelerates convergence on plateaus and flat-bottom regions of the error surface. Simulation results demonstrate that the improved compound gradient vector NN online learning scheme achieves satisfactory convergence and strong robustness in real-time control of plants with uncertain parameters.
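The abstract does not give the paper's exact update rule, but the idea can be sketched under stated assumptions: here the "compound gradient vector" is assumed to combine the current and previous gradients, d_t = g_t + beta * g_{t-1}, and the adaptive learning factor is assumed to grow while the loss decreases (speeding up on plateaus) and shrink when it increases. All names and the specific schedule below are illustrative, not the authors' algorithm.

```python
import numpy as np

def train(loss_grad, w, beta=0.5, eta=0.1, steps=200):
    """Gradient descent with an assumed compound gradient vector
    d_t = g_t + beta * g_{t-1} and an assumed adaptive learning factor."""
    g_prev = np.zeros_like(w)
    prev_loss = np.inf
    for _ in range(steps):
        loss, g = loss_grad(w)
        # Adaptive learning factor (hypothetical schedule):
        # grow while the loss decreases, back off sharply when it rises.
        if loss < prev_loss:
            eta *= 1.05
        else:
            eta *= 0.5
        d = g + beta * g_prev      # compound gradient vector (assumption)
        w = w - eta * d            # weight update
        g_prev, prev_loss = g, loss
    return w, prev_loss

# Toy objective: loss = 0.5 * ||w - 3||^2, gradient = w - 3
def quad(w):
    return 0.5 * np.sum((w - 3.0) ** 2), w - 3.0

w_final, final_loss = train(quad, np.zeros(2))
print(w_final, final_loss)
```

On this toy quadratic the compound term acts like momentum, and the adaptive factor keeps enlarging the step until the loss rises, then halves it, which mimics the speed-up on flat regions described in the abstract.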