Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
The Complexity of Computing
Introduction to Automata Theory, Languages, and Computation
Computation: finite and infinite machines
On the computational power of neural nets
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Constructing deterministic finite-state automata in recurrent neural networks
Journal of the ACM (JACM)
On the effect of analog noise in discrete-time analog computations
Neural Computation
Approximating the Semantics of Logic Programs by Recurrent Neural Networks
Applied Intelligence
Computational complexity of neural networks: a survey
Nordic Journal of Computing
Finite-state computation in analog neural networks: steps towards biologically plausible models?
Emergent neural computational architectures based on neuroscience
Robust Implementation of Finite Automata by Recurrent RBF Networks
SOFSEM '00 Proceedings of the 27th Conference on Current Trends in Theory and Practice of Informatics
The computational power of discrete hopfield nets with hidden units
Neural Computation
Group-Linking Method: A Unified Benchmark for Machine Learning with Recurrent Neural Network
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
The complexity of regular(-like) expressions
DLT'10 Proceedings of the 14th international conference on Developments in language theory
ICANN'05 Proceedings of the 15th international conference on Artificial neural networks: formal models and their applications - Volume Part II
Let K(m) denote the smallest number with the property that every m-state finite automaton can be built as a neural net using K(m) or fewer neurons. A counting argument shows that K(m) is at least Ω((m log m)^(1/3)), and a construction shows that K(m) is at most O(m^(3/4)). The counting argument and the construction allow neural nets with arbitrarily complex local structure, and thus may require neurons that themselves amount to complicated networks. Mild constraints on the local structure of the network, which are almost necessary in practical situations, give, again by a counting argument and a construction, lower and upper bounds for K(m) that are both linear in m.
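The shape of the Ω((m log m)^(1/3)) lower bound can be sketched by comparing two counts. The sketch below uses illustrative bounds that are standard in such arguments (number of distinct m-state automata, number of threshold functions per neuron); the exact constants and gate model are assumptions, not taken from the source.

```latex
% Assumed counts (illustrative):
% - distinct languages of m-state DFAs over a binary alphabet:
\[
N_{\mathrm{aut}}(m) \;=\; 2^{\Theta(m \log m)} ,
\]
% - nets of K threshold neurons with arbitrary interconnection,
%   each neuron realizing one of at most 2^{O(K^2)} threshold
%   functions of K inputs:
\[
N_{\mathrm{net}}(K) \;\le\; \bigl(2^{O(K^2)}\bigr)^{K} \;=\; 2^{O(K^3)} .
\]
% Since every automaton must be realized by some net,
\[
2^{O(K^3)} \;\ge\; 2^{\Theta(m \log m)}
\quad\Longrightarrow\quad
K(m) \;=\; \Omega\!\bigl((m \log m)^{1/3}\bigr) .
\]
```

The cube in the exponent, and hence the 1/3 in the bound, comes from K neurons each depending on up to K inputs with roughly K^2 bits of weight information per neuron.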