Recurrent neural networks, unlike feed-forward networks, are able to process inputs with temporal context. The key role in this process is played by the dynamics of the network, which transforms input data into recurrent-layer states. Several authors have described and analyzed the dynamics of small recurrent neural networks with two or three hidden units. In our work we introduce techniques that make it possible to visualize and analyze the dynamics of large recurrent neural networks with dozens of units, revealing both stable and unstable fixed points (attractors and saddle points), which are important for understanding how the network solves its task. As a practical example of this approach, we study the dynamics of a simple recurrent network trained by two different training algorithms on the context-free language a^n b^n.
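The fixed-point analysis described above can be sketched numerically: a state h is a fixed point of the autonomous update h → step(h) when it does not move, so one can minimize the "speed" ||step(h) − h||² from many random initial states and then classify each point found by the eigenvalues of the Jacobian at that point. The sketch below is a minimal illustration of this idea, not the authors' method: the network size, weight matrix, zero held input, and all function names are assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 12                                               # hypothetical number of hidden units
W = 0.9 * rng.standard_normal((n, n)) / np.sqrt(n)   # illustrative recurrent weights
b = 0.1 * rng.standard_normal(n)                     # illustrative biases

def step(h):
    """One autonomous update of the recurrent layer (input held fixed at zero)."""
    return np.tanh(W @ h + b)

def speed(h):
    """Half the squared distance moved in one step; zero exactly at a fixed point."""
    d = step(h) - h
    return 0.5 * d @ d

def find_fixed_point(h0, tol=1e-8):
    """Minimize the speed from an initial state; accept only near-exact fixed points."""
    res = minimize(speed, h0, method="L-BFGS-B")
    return res.x if speed(res.x) < tol else None

# Search from many random initial states; duplicates collapse onto the same points.
points = [p for h0 in rng.standard_normal((50, n))
          if (p := find_fixed_point(h0)) is not None]

def classify(h):
    """Stable (attractor) if every eigenvalue of the Jacobian of step at h lies
    inside the unit circle; otherwise a saddle or repeller."""
    J = (1 - np.tanh(W @ h + b) ** 2)[:, None] * W
    return "attractor" if np.max(np.abs(np.linalg.eigvals(J))) < 1 else "saddle/repeller"
```

With the recurrent weights scaled modestly as here, the search typically converges onto a small set of distinct fixed points; plotting trajectories together with these points (e.g. after projecting the states to two dimensions) gives the kind of dynamics visualization the abstract refers to.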