Scalability of learning impact on complex parameters in recurrent neural networks
ICANNGA'09: Proceedings of the 9th International Conference on Adaptive and Natural Computing Algorithms
In this article we investigate the impact of the adaptive learning process of recurrent neural networks (RNNs) on the structural properties of the derived graphs. A trained fully connected RNN can be converted to a graph by defining edges between pairs of nodes whose connecting weights are significant. We measured structural properties of the derived graphs, such as the characteristic path length, the clustering coefficient, and the degree distribution. The results imply that a trained RNN has a significantly larger clustering coefficient than a random network with comparable connectivity. Moreover, the degree distributions show the existence of nodes with a large degree, or hubs, which is typical of scale-free networks. We also show analytically and experimentally that this type of degree distribution has increased entropy.
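The conversion and measurement steps described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold value, the 4x4 example weight matrix, and the helper names (`weights_to_graph`, `clustering_coefficient`, `degree_entropy`) are assumptions made for the sketch.

```python
from math import log2
from collections import Counter

def weights_to_graph(W, threshold):
    """Derive an undirected graph from an RNN weight matrix: connect
    nodes i and j when either directed weight between them exceeds
    the threshold in absolute value (an illustrative criterion)."""
    n = len(W)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if abs(W[i][j]) > threshold or abs(W[j][i]) > threshold:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def clustering_coefficient(adj):
    """Average local clustering coefficient over all nodes."""
    coeffs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        # Count edges among the neighbours of v (closed triangles).
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def degree_entropy(adj):
    """Shannon entropy (bits) of the empirical degree distribution."""
    degrees = [len(nbrs) for nbrs in adj.values()]
    n = len(degrees)
    return -sum((c / n) * log2(c / n) for c in Counter(degrees).values())

# Hypothetical trained weight matrix (not from the paper's experiments).
W = [[0.0, 0.9, 0.8, 0.1],
     [0.9, 0.0, 0.7, 0.0],
     [0.8, 0.7, 0.0, 0.2],
     [0.1, 0.0, 0.2, 0.0]]
adj = weights_to_graph(W, threshold=0.5)
print(clustering_coefficient(adj))  # triangle 0-1-2 plus isolated node 3
print(degree_entropy(adj))
```

With the threshold of 0.5, the example matrix yields a triangle among nodes 0, 1, 2 and an isolated node 3, so three nodes have clustering coefficient 1 and one has 0; the same pipeline applied to a trained RNN's weights gives the quantities compared against random graphs in the paper.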