Many researchers are quite skeptical about the actual behavior of neural network learning algorithms like backpropagation. One of the major problems is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the companion of Rosenblatt's PC (perceptron convergence) theorem (1960) for feedforward networks, stating that pattern mode backpropagation converges to an optimal solution for linearly separable patterns.
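To illustrate the setting of the theorem, the following is a minimal sketch (not the paper's construction) of pattern mode training: weights are updated after every single pattern rather than after a full sweep of the training set. The data, network (a single sigmoid unit), learning rate, and epoch count below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearly separable toy data: labels defined by a random hyperplane.
X = rng.normal(size=(200, 2))
true_w = np.array([1.5, -2.0])
y = (X @ true_w > 0).astype(float)  # targets in {0, 1}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
eta = 0.5  # learning rate (illustrative choice)

# Pattern mode ("online") gradient descent: update after each pattern,
# in contrast to batch mode, which accumulates gradients over the whole set.
for epoch in range(200):
    for x, t in zip(X, y):
        out = sigmoid(x @ w + b)
        # Gradient of the squared error 0.5 * (out - t)^2 w.r.t. the weights
        delta = (out - t) * out * (1.0 - out)
        w -= eta * delta * x
        b -= eta * delta

preds = (sigmoid(X @ w + b) > 0.5).astype(float)
print("training accuracy:", (preds == y).mean())  # approaches 1.0 on separable data
```

On linearly separable patterns such as these, the training accuracy approaches 1.0, which is the kind of optimal-convergence behavior the theorem establishes for pattern mode backpropagation.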