We consider analog recurrent neural networks working on infinite input streams, provide a complete topological characterization of their expressive power, and compare it to the expressive power of classical infinite word reading abstract machines. More precisely, we consider analog recurrent neural networks as language recognizers over the Cantor space, and prove that the classes of ω-languages recognized by deterministic and non-deterministic analog networks correspond precisely to the respective classes of Π⁰₂-sets and Σ¹₁-sets of the Cantor space. Furthermore, we show that the result can be generalized to more expressive analog networks equipped with any kind of Borel accepting condition. Therefore, in the deterministic case, the expressive power of analog neural nets turns out to be comparable to the expressive power of any kind of Büchi abstract machine, whereas in the non-deterministic case, analog recurrent networks turn out to be strictly more expressive than any other kind of Büchi or Muller abstract machine, including the main cases of classical automata, 1-counter automata, k-counter automata, pushdown automata, and Turing machines.
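For readers unfamiliar with the topological classes named in the abstract, the standard descriptive-set-theoretic definitions (not taken from the text above, but the usual ones over the Cantor space) can be stated as follows:

```latex
% Cantor space: \{0,1\}^\omega with the product topology.

% \Pi^0_2 sets are countable intersections of open sets (G_\delta sets):
X \in \Pi^0_2
  \iff
X = \bigcap_{n \in \mathbb{N}} U_n,
  \quad \text{where each } U_n \subseteq \{0,1\}^\omega \text{ is open}.

% \Sigma^1_1 (analytic) sets are projections of Borel sets:
X \in \Sigma^1_1
  \iff
\exists\, B \subseteq \{0,1\}^\omega \times \mathbb{N}^{\mathbb{N}}
  \text{ Borel such that }
X = \{\, \alpha \in \{0,1\}^\omega : \exists \beta\; (\alpha,\beta) \in B \,\}.
```

Intuitively, a Π⁰₂ acceptance condition matches the "infinitely often" flavor of Büchi acceptance, while Σ¹₁ corresponds to an existential quantification over an auxiliary infinite guess, which is where non-determinism gains its extra power.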