A Neural Network Online Training Algorithm Based on Compound Gradient Vector. AI '02 Proceedings of the 15th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence.
An improved compound gradient vector based neural network on-line training algorithm. IEA/AIE'2003 Proceedings of the 16th International Conference on Developments in Applied Artificial Intelligence.
ANN-based estimator for distillation using Levenberg-Marquardt approach. Engineering Applications of Artificial Intelligence.
A tabu based neural network learning algorithm. Neurocomputing.
Application of feedforward neural network in the study of dissociated gas flow along the porous wall. Expert Systems with Applications: An International Journal.
An improved training algorithm for feedforward neural network learning based on terminal attractors. Journal of Global Optimization.
Trawling pattern analysis with neural classifier. ICNC'06 Proceedings of the Second International Conference on Advances in Natural Computation - Volume Part I.
We introduce an advanced supervised training method for neural networks based on Jacobian rank deficiency, formulated in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, a modified Gauss-Newton method, has been used successfully to solve nonlinear least-squares problems, including neural-network training. It significantly outperforms basic backpropagation and its variable-learning-rate variants, but at the cost of higher computation and memory complexity per iteration. The new method developed in this paper aims to improve convergence properties while reducing the memory and computation complexity of supervised neural-network training. Extensive simulation results demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.
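The Levenberg-Marquardt update mentioned above can be sketched as follows. This is a minimal illustration only, assuming a finite-difference Jacobian, an adaptive damping schedule, and a toy one-hidden-layer network; all names are ours, and the paper's rank-deficiency refinement is not reproduced here.

```python
import numpy as np

def levenberg_marquardt(residual, w0, n_iter=50, mu=1e-2):
    """Minimise ||residual(w)||^2 via the Levenberg-Marquardt step
    w <- w - (J^T J + mu I)^{-1} J^T r, with a finite-difference Jacobian."""
    w = w0.astype(float).copy()
    r = residual(w)
    eps = 1e-6
    for _ in range(n_iter):
        # Finite-difference Jacobian of the residual vector w.r.t. the weights.
        J = np.empty((r.size, w.size))
        for j in range(w.size):
            wp = w.copy()
            wp[j] += eps
            J[:, j] = (residual(wp) - r) / eps
        step = np.linalg.solve(J.T @ J + mu * np.eye(w.size), J.T @ r)
        w_new = w - step
        r_new = residual(w_new)
        if r_new @ r_new < r @ r:
            # Accept the step and reduce damping (move toward Gauss-Newton).
            w, r, mu = w_new, r_new, mu * 0.5
        else:
            # Reject the step and increase damping (move toward gradient descent).
            mu *= 2.0
    return w

# Toy example: train a 1-3-1 tanh network to fit y = sin(x).
rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 20)
y = np.sin(x)

def residual(w):
    W1, b1, W2, b2 = w[:3], w[3:6], w[6:9], w[9]
    h = np.tanh(np.outer(x, W1) + b1)   # hidden activations, shape (20, 3)
    return h @ W2 + b2 - y              # residual vector, shape (20,)

w0 = rng.normal(scale=0.5, size=10)
w = levenberg_marquardt(residual, w0)
```

Because each iteration forms and factors the `JtJ + mu*I` matrix, the per-iteration cost grows quadratically with the number of weights, which is the memory and computation burden the paper's method targets.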