Continuous-time symmetric Hopfield nets are computationally universal
Neural Computation
We establish a fundamental result in the theory of computation by continuous-time dynamical systems by showing that systems corresponding to so-called continuous-time symmetric Hopfield nets are capable of general computation. More precisely, we prove that any function computed by a discrete-time asymmetric recurrent network of n threshold gates can also be computed by a continuous-time symmetrically coupled Hopfield system of dimension 18n+7. Moreover, if the threshold logic network has maximum weight w_{\max} and converges in discrete time t^*, then the corresponding Hopfield system can be designed to operate in continuous time Θ(t^*/ε), for any value of ε satisfying 0 < w_{\max}2^{3n} \leq ε2^{1/ε}.

The result appears at first sight counterintuitive, because the dynamics of any symmetric Hopfield system is constrained by a Liapunov, or energy, function defined on its state space. In particular, such a system always converges from any initial state towards some stable equilibrium state, and hence cannot exhibit nondamping oscillations; strictly speaking, it cannot simulate even a single alternating bit. However, we show that if one considers only terminating computations, then the Liapunov constraint can be overcome, and one can in fact embed arbitrarily complicated computations in the dynamics of Liapunov systems at only a modest cost in the system's dimensionality.

In terms of standard discrete computation models, our result implies that any polynomially space-bounded Turing machine can be simulated by a family of polynomial-size continuous-time symmetric Hopfield nets.
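The Liapunov constraint mentioned above can be seen numerically. The following minimal sketch (with illustrative randomly chosen parameters of our own, not the 18n+7 construction from the paper) integrates a small continuous-time symmetric Hopfield net with forward-Euler steps and checks that the standard energy function is non-increasing along the trajectory, up to discretization error:

```python
import numpy as np

# Continuous-time Hopfield dynamics:  dx_i/dt = -x_i + sum_j W_ij sigma(x_j) + b_i,
# with a SYMMETRIC coupling matrix W. For symmetric W this dynamics admits the
# classical energy (Liapunov) function below, so trajectories can only run
# "downhill" toward a stable equilibrium.

rng = np.random.default_rng(0)      # illustrative parameters, not from the paper
n = 5
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                   # symmetrize: essential for the energy argument
b = rng.standard_normal(n)
sigma = np.tanh                     # smooth saturating activation

def energy(x):
    """E(x) = -1/2 s'Ws - b's + sum_i int_0^{s_i} arctanh(u) du, with s = sigma(x)."""
    s = sigma(x)
    # closed form of the integral for tanh: s*arctanh(s) + (1/2)ln(1 - s^2)
    leak = np.sum(s * np.arctanh(s) + 0.5 * np.log1p(-s**2))
    return -0.5 * s @ W @ s - b @ s + leak

# Forward-Euler integration from a random initial state.
x = rng.standard_normal(n)
dt = 1e-3
energies = []
for _ in range(20000):
    energies.append(energy(x))
    x += dt * (-x + W @ sigma(x) + b)

diffs = np.diff(energies)
print("total energy change:", energies[-1] - energies[0])
print("largest per-step increase:", diffs.max())
```

One can verify that d/dt E(x(t)) = -sum_i sigma'(x_i) (dx_i/dt)^2 <= 0, which is exactly why such a net cannot sustain a nondamping oscillation; with an asymmetric W no such energy function exists in general, which is what makes the simulation result above surprising.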