This paper describes a hybrid system that combines weightless neural networks (WNNs) with finite state automata. The system allows symbolic rules to be inserted into, and extracted from, WNNs. Algorithms for both rule insertion and rule extraction are proposed, with a detailed discussion of their advantages and disadvantages. Rule insertion and extraction are often more natural in WNNs than in other neural network models, since a WNN stores its knowledge directly as the contents of RAM nodes.
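The correspondence between automaton transitions and RAM node contents can be illustrated with a minimal sketch. The encoding below (state bits and input bits concatenated into a RAM address, one transition per address) is an illustrative assumption, not the paper's exact scheme; the function names are hypothetical.

```python
# Hedged sketch: rule insertion into, and extraction from, a RAM-based
# (weightless) node. The address encoding is an assumption for illustration.

def insert_rules(transitions, input_bits):
    """Store FSA transitions (state, symbol) -> next_state as RAM contents."""
    ram = {}
    for (state, symbol), nxt in transitions.items():
        addr = (state << input_bits) | symbol  # concatenate bit fields
        ram[addr] = nxt
    return ram

def extract_rules(ram, input_bits):
    """Read the RAM contents back as symbolic transition rules."""
    rules = {}
    for addr, nxt in ram.items():
        state = addr >> input_bits
        symbol = addr & ((1 << input_bits) - 1)
        rules[(state, symbol)] = nxt
    return rules

# Toy 2-state automaton over {0, 1} that flips state on input 1.
transitions = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
ram = insert_rules(transitions, input_bits=1)
recovered = extract_rules(ram, input_bits=1)
assert recovered == transitions  # the round trip recovers the rules exactly
```

Because every stored rule occupies a distinct RAM address, extraction is a direct enumeration of memory contents rather than an approximation, which is what makes the process comparatively natural for weightless models.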