We establish a fundamental result in the theory of computation by continuous-time dynamical systems, showing that systems corresponding to so-called continuous-time symmetric Hopfield nets are capable of general computation. As is well known, such networks have very constrained, Lyapunov-function-controlled dynamics. Nevertheless, we show that they are universal and efficient computational devices, in the sense that any convergent synchronous fully parallel computation by a recurrent network of n discrete-time binary neurons, with in general asymmetric coupling weights, can be simulated by a symmetric continuous-time Hopfield net containing only 18n + 7 units employing the saturated-linear activation function. Moreover, if the asymmetric network has maximum integer weight size W_max and converges in discrete time t*, then the corresponding Hopfield net can be designed to operate in continuous time Θ(t*/ε) for any ε > 0 such that W_max·2^(12n) ≤ ε·2^(1/ε). In terms of standard discrete computation models, our result implies that any polynomially space-bounded Turing machine can be simulated by a family of polynomial-size continuous-time symmetric Hopfield nets.
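As a minimal Python sketch (not from the paper, and not the simulation construction itself), the snippet below illustrates two ingredients named above: the saturated-linear activation used by the continuous-time net, and the energy (Lyapunov) function whose monotone decrease makes symmetric-weight Hopfield dynamics convergent. The random weight matrix, bias vector, and update count are hypothetical illustrative choices; the discrete asynchronous update shown here is the classical convergent special case, not the asymmetric synchronous network being simulated.

```python
import numpy as np

def sat_lin(x):
    # Saturated-linear activation: identity on [0, 1], clipped outside.
    return np.clip(x, 0.0, 1.0)

rng = np.random.default_rng(0)
n = 6

# Hypothetical symmetric coupling weights with zero diagonal, plus biases.
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.standard_normal(n)

def energy(s):
    # Hopfield energy E(s) = -1/2 s^T W s - b^T s.
    # For symmetric W it is a Lyapunov function of the dynamics.
    return -0.5 * s @ W @ s - b @ s

# Asynchronous binary updates: each step sets one unit to the value that
# minimizes the energy in that coordinate, so E never increases and the
# network settles into a stable state.
s = rng.integers(0, 2, n).astype(float)
energies = [energy(s)]
for _ in range(50):
    i = rng.integers(n)
    s[i] = 1.0 if W[i] @ s + b[i] > 0 else 0.0
    energies.append(energy(s))

# The recorded energy trace is non-increasing.
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:]))

# The saturated-linear function clips into [0, 1].
assert sat_lin(np.array([-0.5, 0.3, 2.0])).tolist() == [0.0, 0.3, 1.0]
```

The constrained, energy-decreasing behavior checked by the first assertion is exactly the dynamical restriction that makes the universality result above surprising: despite being driven downhill on a fixed energy landscape, the continuous-time symmetric net can still carry out arbitrary convergent computations.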