On the Computational Complexity of Binary and Analog Symmetric Hopfield Nets
Neural Computation
We consider the convergence behavior of a class of continuous-time dynamical systems corresponding to the so-called symmetric Hopfield nets studied in neural network theory. We prove that such systems may have transient times that are exponential in the system dimension (i.e., the number of "neurons"), despite the fact that their dynamics are controlled by Liapunov functions. This result stands in contrast to many proposed uses of such systems in, e.g., combinatorial optimization applications, where it is often implicitly assumed that convergence is rapid. An additional interesting observation is that our example of an exponential-transient continuous-time system (a simulated binary counter) in fact converges more slowly than any discrete-time Hopfield system of the same representation size. This suggests that continuous-time systems may be worth investigating for gains in descriptional efficiency over their discrete-time counterparts.
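To illustrate the convergence mechanism the abstract refers to, the following is a minimal sketch (not the paper's construction) of a discrete-time symmetric Hopfield net with the standard Liapunov energy E(x) = -½ xᵀWx - bᵀx. With symmetric weights and zero self-connections, each asynchronous flip can only decrease the energy, which guarantees convergence to a fixed point; the transient length, however, is exactly what the paper shows can be exponential. All names (`energy`, `run_hopfield`) are illustrative.

```python
import numpy as np

def energy(W, b, x):
    # Liapunov (energy) function of a symmetric Hopfield net:
    # E(x) = -1/2 x^T W x - b^T x
    return -0.5 * x @ W @ x - b @ x

def run_hopfield(W, b, x, max_sweeps=1000):
    # Asynchronous discrete-time updates. With symmetric W and a zero
    # diagonal, flipping unit i to sign(W[i] @ x + b[i]) changes the
    # energy by -2*|h_i| <= 0, so E is non-increasing and the net
    # reaches a fixed point after finitely many flips.
    n = len(x)
    energies = [energy(W, b, x)]
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            new_xi = 1.0 if W[i] @ x + b[i] >= 0 else -1.0
            if new_xi != x[i]:
                x[i] = new_xi
                changed = True
                energies.append(energy(W, b, x))
        if not changed:          # a full sweep with no flips: fixed point
            break
    return x, energies

# Random symmetric instance (assumed parameters, for illustration only).
rng = np.random.default_rng(0)
n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)         # zero self-connections keep the energy argument valid
b = rng.normal(size=n)
x0 = rng.choice([-1.0, 1.0], size=n)
x_final, energies = run_hopfield(W, b, x0.copy())
```

Running this shows the energy trace decreasing monotonically until a stable state is reached; the paper's point is that in the continuous-time analog, the number of such "steps" before stabilization can grow exponentially with n.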