Constructing deterministic finite-state automata in recurrent neural networks
Journal of the ACM (JACM)
We propose an algorithm for encoding deterministic finite-state automata (DFAs) in second-order recurrent neural networks with a sigmoidal discriminant function, and we prove that the languages accepted by the constructed network and the DFA are identical. The desired finite-state network dynamics is achieved by programming only a small subset of all weights. A worst-case analysis reveals a relationship between the weight strength and the maximum network size for which finite-state behavior of the constructed network is guaranteed. We illustrate the method by encoding random DFAs with 10, 100, and 1000 states. While the theory predicts that the weight strength scales with the DFA size, we empirically find the weight strength to be almost constant across all the random DFAs. These results can be explained by noting that the randomly generated DFAs represent average cases. We then empirically demonstrate the existence of extreme DFAs for which the weight strength does scale with the DFA size.
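
The abstract does not spell out the construction, so the following Python sketch only illustrates the kind of second-order encoding it describes, under stated assumptions: one sigmoid state neuron per DFA state, one-hot input symbols, weights of strength +H exciting the successor-state neuron and -H inhibiting the departing one, and biases of -H/2. The update rule is S_i(t+1) = g(sum over j,k of W_ijk * S_j(t) * I_k(t) + b_i), with g the sigmoid discriminant. The function names and the default value of H are illustrative, not taken from the paper.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def encode_dfa(num_states, num_symbols, delta, H=8.0):
        # Program second-order weights so that reading symbol k while
        # state neuron j is active excites the neuron of the successor
        # state delta[j][k] and inhibits neuron j itself.  H is the
        # weight-strength parameter; the paper's worst-case analysis
        # bounds how large H must be for a given network size.
        W = np.zeros((num_states, num_states, num_symbols))
        for j in range(num_states):
            for k in range(num_symbols):
                i = delta[j][k]
                W[i, j, k] = H           # excite the successor state neuron
                if i != j:
                    W[j, j, k] = -H      # inhibit the departing state neuron
        b = -H / 2.0 * np.ones(num_states)  # keep inactive neurons near 0
        return W, b

    def run(W, b, start, accept, symbols):
        # Iterate S_i(t+1) = sigmoid(sum_{j,k} W[i,j,k] * S[j] * I[k] + b[i])
        # over a one-hot input sequence, then threshold accepting neurons.
        S = np.zeros(W.shape[0])
        S[start] = 1.0                   # one-hot start state
        for k in symbols:
            I = np.zeros(W.shape[2])
            I[k] = 1.0                   # one-hot input symbol
            S = sigmoid(np.einsum('ijk,j,k->i', W, S, I) + b)
        return any(S[q] > 0.5 for q in accept)

    # Example: a two-state DFA over {0, 1} accepting strings with an
    # even number of 1s.
    W, b = encode_dfa(num_states=2, num_symbols=2, delta=[[0, 1], [1, 0]])
    print(run(W, b, start=0, accept={0}, symbols=[1, 0, 1]))  # True: two 1s

With a fixed H such as the default used here, the state neurons stay near 0 or 1 for small machines, matching the abstract's observation that the weight strength is almost constant for average-case DFAs; for the extreme DFAs it mentions, H would have to grow with the machine size to keep the dynamics finite-state.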