On the weight convergence of Elman networks
IEEE Transactions on Neural Networks
We give a rigorous analysis of the convergence properties of a backpropagation algorithm for recurrent networks containing either output-layer or hidden-layer recurrence. The conditions permit data generated by stochastic processes with considerable dependence. Restrictions are given that may help ensure convergence of the network parameters to a local optimum, as simulations illustrate.
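To make the setting concrete, the sketch below implements an Elman-style network with hidden-layer recurrence, trained by plain gradient descent with backpropagation through time on a squared-error loss. This is an illustrative assumption of the architecture the abstract refers to, not the paper's analyzed algorithm or experimental setup: the layer sizes, learning rate, synthetic data, and omission of bias terms are all choices made here for brevity, and the paper's convergence conditions would constrain quantities such as the step size and the dependence in the data.

```python
# Minimal sketch (assumed setup): Elman network with hidden-layer
# recurrence, squared-error loss L = 0.5 * sum_t ||y_t - t_t||^2,
# gradients via backpropagation through time, plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 1, 8, 1
lr = 0.05  # assumed step size; convergence conditions constrain this choice

# Parameters: input-to-hidden, hidden-to-hidden (recurrent), hidden-to-output.
W_x = rng.normal(scale=0.1, size=(n_hid, n_in))
W_h = rng.normal(scale=0.1, size=(n_hid, n_hid))
W_y = rng.normal(scale=0.1, size=(n_out, n_hid))

def forward(xs):
    """Run the network over a sequence, caching states for backprop."""
    h = np.zeros(n_hid)
    hs, ys = [h], []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)  # Elman hidden-layer recurrence
        hs.append(h)
        ys.append(W_y @ h)
    return hs, ys

def bptt(xs, ts):
    """Backpropagation through time for the squared-error loss."""
    hs, ys = forward(xs)
    gW_x = np.zeros_like(W_x)
    gW_h = np.zeros_like(W_h)
    gW_y = np.zeros_like(W_y)
    dh_next = np.zeros(n_hid)
    for t in reversed(range(len(xs))):
        dy = ys[t] - ts[t]                # dL/dy at step t
        gW_y += np.outer(dy, hs[t + 1])
        dh = W_y.T @ dy + dh_next         # gradient flowing into h_t
        dz = dh * (1.0 - hs[t + 1] ** 2)  # through tanh
        gW_x += np.outer(dz, xs[t])
        gW_h += np.outer(dz, hs[t])       # hs[t] is h_{t-1}
        dh_next = W_h.T @ dz              # pass back through the recurrence
    return gW_x, gW_h, gW_y

# Toy task (assumed): predict the next value of a noisy sine wave, a
# dependent, non-i.i.d. sequence in the spirit of the data permitted.
seq = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.05 * rng.normal(size=200)
xs = [np.array([v]) for v in seq[:-1]]
ts = [np.array([v]) for v in seq[1:]]

for epoch in range(200):
    gW_x, gW_h, gW_y = bptt(xs, ts)
    W_x -= lr * gW_x / len(xs)  # plain gradient descent step
    W_h -= lr * gW_h / len(xs)
    W_y -= lr * gW_y / len(xs)
```

With a small enough step size, the weight updates settle toward a stationary point of the loss; this mirrors, at the level of a toy example, the kind of local-optimum convergence the analysis addresses.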