Elman (1991) proposed a simple recurrent network that is able to identify and classify temporal patterns. Although Elman networks have been used extensively in many different fields, their theoretical capabilities have not been completely characterized. Research in the 1960s showed that for every finite state machine there exists a recurrent artificial neural network which approximates it to an arbitrary degree of precision. This paper extends that result to architectures meeting the constraints of Elman networks, thus proving that their computational power is as great as that of finite state machines.
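The construction behind this kind of result can be illustrated concretely. The sketch below (an assumption for illustration, not the paper's actual proof) hand-wires an Elman-style network of high-gain sigmoid units to emulate the two-state parity automaton: each hidden unit stands for one (state, input) transition, the context (recurrent) connections feed each unit from the units whose target state matches its source state, and a single output unit reads off whether the automaton is in the "odd" state.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

GAIN = 20.0  # high-gain sigmoid drives unit activations to near-binary values

# Hidden units encode the transitions (state, input) of the parity automaton:
#   0: (even, 0) -> even    1: (even, 1) -> odd
#   2: (odd, 0)  -> odd     3: (odd, 1)  -> even
# Recurrent wiring: unit j listens to the units whose TARGET state equals
# j's SOURCE state (targets "even": units 0, 3; targets "odd": units 1, 2).
SOURCES = {0: (0, 3), 1: (0, 3), 2: (1, 2), 3: (1, 2)}
INPUT_BIT = {0: 0, 1: 1, 2: 0, 3: 1}  # input symbol each unit requires

def parity_elman(bits):
    """Run the hand-wired Elman-style network over a bit sequence; the
    output is close to 1 iff the sequence contains an odd number of 1s."""
    h = [1.0, 0.0, 0.0, 0.0]  # initial context: unit 0 active => state "even"
    for b in bits:
        x = (1.0 - b, float(b))  # one-hot encoding of the input symbol
        # Unit j fires iff its input symbol is present AND its source state
        # was active; the -1.5 bias implements the conjunction.
        h = [sigmoid(GAIN * (x[INPUT_BIT[j]]
                             + sum(h[k] for k in SOURCES[j]) - 1.5))
             for j in range(4)]
    # Output unit: active iff the automaton is in the "odd" state (unit 1 or 2).
    return sigmoid(GAIN * (h[1] + h[2] - 0.5))
```

Because the sigmoids are driven far into saturation, the network's state trajectory tracks the automaton's exactly up to an exponentially small error, which is the sense in which the approximation can be made arbitrarily precise.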