Discontinuities in recurrent neural networks
Neural Computation
Computing with continuous-time Liapunov systems
STOC '01 Proceedings of the thirty-third annual ACM symposium on Theory of computing
Neural and Super-Turing Computing
Minds and Machines
Some Afterthoughts on Hopfield Networks
SOFSEM '99 Proceedings of the 26th Conference on Current Trends in Theory and Practice of Informatics
Robust Implementation of Finite Automata by Recurrent RBF Networks
SOFSEM '00 Proceedings of the 27th Conference on Current Trends in Theory and Practice of Informatics
Continuous-time symmetric Hopfield nets are computationally universal
Neural Computation
On the Computational Complexity of Binary and Analog Symmetric Hopfield Nets
Neural Computation
Oracles and Advice as Measurements
UC '08 Proceedings of the 7th international conference on Unconventional Computing
The expressive power of analog recurrent neural networks on infinite input streams
Theoretical Computer Science
The computational power of recurrent neural networks is shown to depend ultimately on the complexity of the real constants (weights) of the network. The complexity, or information content, of the weights is measured by a variant of resource-bounded Kolmogorov (1965) complexity, which takes into account the time required for constructing the numbers. In particular, we reveal a full and proper hierarchy of nonuniform complexity classes associated with networks whose weights have increasing Kolmogorov complexity.
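The abstract concerns analog recurrent neural networks, whose state evolves by applying a saturating activation to a weighted sum of the previous state and the input. A minimal sketch of one such update step follows; the saturated-linear activation is the standard choice in this line of work, while the specific weight matrix and input encoding below are illustrative assumptions, not taken from the cited papers:

```python
# Sketch of an analog recurrent neural network (ARNN) update step.
# The 2-neuron weight matrix and input coupling below are illustrative
# assumptions; only the saturated-linear activation is standard.

def sat_lin(x: float) -> float:
    """Saturated-linear activation: clip x to the interval [0, 1]."""
    return min(1.0, max(0.0, x))

def step(state, weights, inp):
    """One synchronous update: x(t+1) = sigma(W x(t) + input(t))."""
    n = len(state)
    return [
        sat_lin(sum(weights[i][j] * state[j] for j in range(n)) + inp[i])
        for i in range(n)
    ]

# Example: two neurons with rational weights processing a binary stream.
W = [[0.5, 0.25], [0.125, 0.5]]
x = [0.0, 0.0]
for bit in [1, 0, 1, 1]:
    x = step(x, W, [bit * 0.5, 0.0])
```

The point of the abstract is that the descriptional complexity of entries such as those in `W` (rational here, but real in general) bounds the computational power of the network: weights of higher Kolmogorov complexity yield strictly larger nonuniform complexity classes.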