Neural networks and analog computation: beyond the Turing limit
Computing with continuous-time Liapunov systems
STOC '01 Proceedings of the thirty-third annual ACM symposium on Theory of computing
Recurrent Neural Networks: Design and Applications
Neural Networks for Optimization and Signal Processing
On the Computational Complexity of Binary and Analog Symmetric Hopfield Nets
Neural Computation
We establish a fundamental result in the theory of continuous-time neural computation by showing that so-called continuous-time symmetric Hopfield nets, whose asymptotic convergence is always guaranteed by the existence of a Liapunov function, may in the worst case possess a transient period that is exponential in the network size. This result stands in contrast to, for example, the use of such network models in combinatorial optimization applications.
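The abstract's claim rests on the standard continuous-time symmetric Hopfield model and its Liapunov (energy) function: symmetry of the weights guarantees that the energy decreases along every trajectory, yet the time spent in the transient before settling can still be exponential in the network size. The following minimal sketch is illustrative only and not code from the paper; it simulates the classical Hopfield (1984) continuous model with tanh units and a symmetric weight matrix, and checks numerically that the energy is non-increasing along a trajectory. The network size, weights, inputs, and step size are arbitrary assumed values.

import numpy as np

# Illustrative parameters (assumed, not taken from the paper).
rng = np.random.default_rng(0)
n = 8
W = rng.standard_normal((n, n)) / np.sqrt(n)
W = (W + W.T) / 2.0              # symmetric weights: this is what makes the energy a Liapunov function
np.fill_diagonal(W, 0.0)
I = 0.5 * rng.standard_normal(n) # external inputs

def energy(u):
    # Liapunov function E(V) = -1/2 V^T W V - I^T V + sum_i integral_0^{V_i} arctanh(s) ds,
    # with V = tanh(u); the integral has the closed form V*arctanh(V) + (1/2) log(1 - V^2).
    V = np.tanh(u)
    leak = V * np.arctanh(V) + 0.5 * np.log1p(-V ** 2)
    return -0.5 * V @ W @ V - I @ V + leak.sum()

# Continuous-time dynamics du_i/dt = -u_i + sum_j w_ij * tanh(u_j) + I_i,
# integrated here with a small forward-Euler step.
u = rng.standard_normal(n)
dt = 1e-3
energies = [energy(u)]
for _ in range(20000):
    u = u + dt * (-u + W @ np.tanh(u) + I)
    energies.append(energy(u))

# The energy never increases along the trajectory (up to small discretization error),
# so convergence is guaranteed; the paper's point is that this convergence can still
# take exponentially long in the worst case.
assert np.diff(energies).max() <= 1e-6
print("initial energy:", energies[0], "final energy:", energies[-1])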