On the numerical condition of polynomials in Bernstein form
Computer Aided Geometric Design
Neural Computation
Recurrent Neural Networks: Design and Applications
Backpropagation and Recurrent Neural Networks in Financial Analysis of Multiple Stock Market Returns
HICSS '96 Proceedings of the 29th Hawaii International Conference on System Sciences Volume 2: Decision Support and Knowledge-Based Systems
Training general dynamic neural networks
2005 Special Issue: The loading problem for recursive neural networks
Neural Networks - Special issue on neural networks and kernel methods for structured domains
ICML '06 Proceedings of the 23rd international conference on Machine learning
Neural Computation
A learning algorithm for continually running fully recurrent neural networks
Neural Computation
Sequence labelling in structured domains with hierarchical recurrent neural networks
IJCAI'07 Proceedings of the 20th international joint conference on Artificial intelligence
New results on recurrent network training: unifying the algorithms and accelerating convergence
IEEE Transactions on Neural Networks
Stability analysis of discrete-time recurrent neural networks
IEEE Transactions on Neural Networks
Backpropagation Algorithms for a Broad Class of Dynamic Networks
IEEE Transactions on Neural Networks
This paper gives a detailed analysis of the error surfaces of certain recurrent networks and explains some of the difficulties encountered in training them. We show that these error surfaces contain many spurious valleys, and we analyze the mechanisms that cause the valleys to appear. We demonstrate that the principal mechanism can be understood through the analysis of the roots of random polynomials. This paper also provides suggestions for improvements in batch training procedures that can help avoid the difficulties caused by spurious valleys, thereby improving training speed and reliability.
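The connection to random polynomials rests on a well-known numerical fact: the roots of a polynomial with random coefficients tend to cluster near the unit circle. The following sketch (a hypothetical illustration assuming NumPy, not code from the paper) checks this clustering for one randomly generated polynomial:

```python
# Minimal illustration (hypothetical, not from the paper): roots of a
# polynomial with random coefficients cluster near the unit circle.
# The abstract states that the spurious-valley mechanism can be understood
# through the roots of random polynomials; here we only verify the
# clustering numerically for a single random instance.
import numpy as np

rng = np.random.default_rng(0)
degree = 50
coeffs = rng.standard_normal(degree + 1)  # random polynomial coefficients
roots = np.roots(coeffs)                  # complex roots of the polynomial

# Distance of each root's modulus from 1 (i.e., from the unit circle).
dist = np.abs(np.abs(roots) - 1.0)
frac_near = np.mean(dist < 0.3)
print(f"{frac_near:.2f} of the roots lie within 0.3 of the unit circle")
```

Intuitively, for a linear recurrent network the output is a polynomial in the recurrent weight, so weight values near such roots drive the output (and hence the error) toward zero along narrow regions of weight space.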