Encoding of Probabilistic Automata into RAM-Based Neural Networks

  • Authors:
  • Affiliations:
  • Venue:
  • IJCNN '00: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00) - Volume 3
  • Year:
  • 2000


Abstract

As one of its contributions, this paper introduces a new recognition algorithm for a class of RAM-based (weightless) neural networks called General Single-layer Sequential Weightless Neural Networks (GSSWNNs). These networks are assumed to be implemented with either pRAM nodes or MPLNs. The new algorithm makes such networks behave as probabilistic automata, and the computability of GSSWNNs is shown to be equivalent to that of probabilistic automata. Indeed, one of the proofs provides an algorithm that maps any probabilistic automaton into a GSSWNN. In other words, the proposed method not only allows the construction of any probabilistic automaton, but also enlarges the class of functions such networks can compute: they are no longer restricted to finite-state languages and can now handle some context-free languages.
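To make the underlying idea concrete, the following is a minimal, illustrative sketch (not the paper's actual algorithm or network) of a probabilistic automaton whose transitions are stored in a RAM-style lookup table, the way a pRAM/MPLN node stores a probability at each address. The class name, parameters, and the toy automaton are all hypothetical choices for this example.

```python
# Illustrative sketch, not the GSSWNN construction from the paper:
# a probabilistic automaton whose transition function is stored as a
# lookup table addressed by (state, input symbol), analogous to how a
# pRAM/MPLN node stores a probability at each RAM address.
class ProbabilisticAutomaton:
    def __init__(self, n_states, transitions, start, accepting):
        # transitions[(state, symbol)] -> list of (next_state, prob)
        self.n = n_states
        self.ram = transitions          # the "RAM" lookup table
        self.start = start
        self.accepting = set(accepting)

    def acceptance_probability(self, word):
        # Propagate the distribution over states exactly (no sampling).
        dist = [0.0] * self.n
        dist[self.start] = 1.0
        for sym in word:
            nxt = [0.0] * self.n
            for s, p in enumerate(dist):
                if p == 0.0:
                    continue
                for t, q in self.ram[(s, sym)]:
                    nxt[t] += p * q
            dist = nxt
        return sum(dist[s] for s in self.accepting)

# Toy two-state automaton over {'a'}: from state 0, reading 'a' moves
# to state 1 with probability 0.5 and stays with probability 0.5;
# state 1 is absorbing and accepting.
pa = ProbabilisticAutomaton(
    2,
    {(0, 'a'): [(0, 0.5), (1, 0.5)],
     (1, 'a'): [(1, 1.0)]},
    start=0,
    accepting=[1],
)
print(pa.acceptance_probability('aa'))  # 0.75
```

With an acceptance threshold (cut-point), such an automaton defines a stochastic language; the paper's result is that GSSWNNs can realize any such automaton.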