An improved compound gradient vector based neural network on-line training algorithm
IEA/AIE'2003: Proceedings of the 16th International Conference on Developments in Applied Artificial Intelligence
This paper presents a new online weight-update scheme for neural network training based on a compound gradient vector. The convergence analysis indicates that, because the compound gradient vector is employed in the weight update, the proposed algorithm converges faster than the standard back-propagation (BP) algorithm. Convergence performance is further improved by two techniques introduced in the scheme: comprehensive parameter adaptation and saturation compensation. Several simulations demonstrate that the improved online learning scheme achieves satisfactory convergence and strong robustness in real-time control under parameter uncertainty.
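The abstract does not spell out the exact update rule, but the general idea of descending along a compound gradient vector can be sketched in a few lines. Everything below is an illustrative assumption, not the authors' algorithm: the compound gradient is taken here as a weighted combination of the current and previous gradients (similar in spirit to momentum), applied to a one-dimensional least-squares problem; the step size `lr` and mixing weight `beta` are hypothetical parameters.

```python
def train(lr=0.1, beta=0.5, steps=200):
    """Gradient descent on f(w) = 0.5 * (w - 3)**2 using a 'compound'
    gradient: a weighted mix of the current and previous gradients.
    This is a sketch under assumed definitions, not the paper's rule."""
    w = 0.0
    prev_grad = 0.0
    for _ in range(steps):
        grad = w - 3.0                      # df/dw for f(w) = 0.5*(w-3)^2
        compound = grad + beta * prev_grad  # assumed compound gradient vector
        w -= lr * compound                  # weight update along compound direction
        prev_grad = grad
    return w

print(train())  # converges close to the minimizer w* = 3
```

For this choice of `lr` and `beta`, the error recursion e_{k+1} = 0.9 e_k - 0.05 e_{k-1} has both characteristic roots inside the unit circle, so the iterate settles at the minimizer; reusing the previous gradient is what lets such schemes outpace plain gradient descent on ill-conditioned problems.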