Abstract. Recurrent neural networks readily process, recognize, and generate temporal sequences. By encoding grammatical strings as temporal sequences, recurrent neural networks can be trained to behave like deterministic sequential finite-state automata. Algorithms have been developed for extracting grammatical rules from trained networks. Using a simple method for inserting prior knowledge (or rules) into recurrent neural networks, we show that recurrent neural networks are able to perform rule revision. Rule revision is performed by comparing the inserted rules with the rules in the finite-state automata extracted from trained networks. The results from training a recurrent neural network to recognize a known non-trivial, randomly generated regular grammar show that the networks not only preserve correct rules but are also able to correct, through training, inserted rules that were initially incorrect. (By incorrect, we mean that the rules were not the ones in the randomly generated grammar.)
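The rule-insertion idea described above can be illustrated with a small sketch. The details below (the 2-state example DFA, the hint strength `H`, and the bias term) are illustrative assumptions, not the paper's exact construction: each known DFA transition delta(q_i, a_k) = q_j is "programmed" into the second-order weight W[j][i][k] of a recurrent network with a large positive value, and competing weights are set to a large negative value, so that before any training the network already simulates the inserted rules.

```python
import math

# Hypothetical example DFA: 2 states over alphabet {0, 1}, accepting strings
# with an even number of 1s.  delta[state][symbol] -> next state; state 0 accepts.
delta = {0: {0: 0, 1: 1}, 1: {0: 1, 1: 0}}
accepting = {0}

H = 6.0                     # assumed "hint strength" for the inserted rules
n_states, n_symbols = 2, 2

# Second-order weights W[j][i][k]: contribution to next-state unit j
# from current-state unit i when reading input symbol k.
# Initialize all weights to -H, then overwrite the known transitions with +H.
W = [[[-H for _ in range(n_symbols)] for _ in range(n_states)]
     for _ in range(n_states)]
for i in range(n_states):
    for k in range(n_symbols):
        W[delta[i][k]][i][k] = H  # program the inserted rule delta(i, k) = j

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run(string):
    # One-hot initial state vector: start in DFA state 0.
    s = [1.0 if j == 0 else 0.0 for j in range(n_states)]
    for ch in string:
        k = int(ch)
        # Second-order update with an assumed bias of -H/2 per unit.
        s = [sigmoid(sum(W[j][i][k] * s[i] for i in range(n_states)) - H / 2)
             for j in range(n_states)]
    # Accept if the most active state unit corresponds to an accepting state.
    return max(range(n_states), key=lambda j: s[j]) in accepting

print(run("0110"))  # even number of 1s -> True
print(run("10"))    # odd number of 1s  -> False
```

Because the inserted weights are large but finite, gradient training can still move them; this is what lets the network revise an incorrectly inserted rule while preserving correct ones, and the revised rules can then be read off by a rule-extraction algorithm.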