In this paper a recurrent network consisting of O(√m log m) RBF (radial basis function) units with the maximum norm, employing any activation function that takes different values at at least two nonnegative points, is constructed to implement a given deterministic finite automaton with m states. The underlying simulation proves robust with respect to analog noise for a large class of smooth activation functions with a particular type of inflection point.
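To make the flavor of the construction concrete, the following is a minimal sketch (not the paper's O(√m log m) construction, which compresses the state encoding) of how RBF units with the maximum norm can realize a DFA's transition function: one unit is centered at each (state, symbol) pair, a threshold activation (which indeed takes different values at two nonnegative points) selects the matching unit, and the fired unit emits the successor state. The toy parity automaton and all identifiers here are illustrative assumptions.

```python
import numpy as np

# Toy DFA: tracks the parity of 1s in a binary word.
# States: 0 (even), 1 (odd); transition(s, a) = s XOR a.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# One RBF unit per (state, symbol) pair: its center is that pair,
# and its output weight is the successor state.
centers = np.array(list(delta.keys()), dtype=float)
targets = np.array(list(delta.values()), dtype=float)

def activation(d):
    # Any activation taking different values at two nonnegative points
    # suffices in principle; here a simple threshold at distance 0.5.
    return (d < 0.5).astype(float)

def step(state, symbol):
    x = np.array([state, symbol], dtype=float)
    # Maximum-norm (Chebyshev) distance from the input to each center.
    d = np.max(np.abs(centers - x), axis=1)
    h = activation(d)          # exactly one unit fires
    return int(h @ targets)    # fired unit outputs the next state

def run(word, start=0):
    state = start
    for a in word:
        state = step(state, a)
    return state
```

This naive scheme needs one unit per transition; the point of the paper's construction is that a cleverer, compressed state encoding brings the unit count down to O(√m log m) while remaining noise-robust for suitable smooth activations.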