Connectionist learning models have had considerable empirical success, but it is hard to characterize exactly what they learn. The learning of finite-state languages (FSL) from example strings is a domain which has been extensively studied and might provide an opportunity to help understand connectionist learning. A major problem is that traditional FSL learning assumes the storage of all examples and thus violates connectionist principles. This paper presents a provably correct algorithm for inferring any minimum-state deterministic finite-state automaton (FSA) from a complete ordered sample, using limited total storage and without storing example strings. The algorithm is an iterative strategy that uses at each stage a current encoding of the data considered so far, together with a single sample string. A crucial advantage of our algorithm is that the total amount of space used in the course of learning, for encoding any finite prefix of the sample, is polynomial in the size of the inferred minimum-state deterministic FSA. The algorithm is also relatively efficient in time and has been implemented. More importantly, there is a connectionist version of the algorithm that preserves these properties. The connectionist version requires much more structure than the usual models and has been implemented using the Rochester Connectionist Simulator. We also show that no machine with finite working storage can iteratively identify the FSL from arbitrary presentations.
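As a concrete illustration of the setting (a sketch of the learning model only, not the paper's algorithm), the snippet below shows a deterministic FSA over a finite alphabet and the "complete ordered sample" presentation: every string over the alphabet appears exactly once, in length-lexicographic order, labelled by target membership. An iterative learner in the sense above would consume these pairs one at a time, keeping only its bounded hypothesis encoding between steps and discarding each example after use. The even-parity target DFA is an illustrative choice, not taken from the paper.

```python
from itertools import product

def accepts(delta, start, finals, s):
    """Run a DFA (delta: dict mapping (state, symbol) -> state) on string s."""
    q = start
    for sym in s:
        q = delta[(q, sym)]
    return q in finals

def complete_ordered_sample(delta, start, finals, alphabet, max_len):
    """Yield (string, label) pairs in length-lexicographic order up to max_len.

    This is the 'complete ordered sample' presentation: each string over
    the alphabet occurs exactly once, labelled by membership in the target.
    """
    for n in range(max_len + 1):
        for tup in product(alphabet, repeat=n):
            s = "".join(tup)
            yield s, accepts(delta, start, finals, s)

# Illustrative target: binary strings with an even number of '1's
# (a 2-state minimum DFA; state 0 = even parity, accepting).
delta = {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0}
sample = list(complete_ordered_sample(delta, 0, {0}, "01", 2))
# An iterative learner processes `sample` one pair at a time; between
# steps it stores only its current hypothesis encoding, never the pairs.
```

The loop body where a learner updates its hypothesis is deliberately left out: the paper's contribution is precisely that such an update can be done correctly in space polynomial in the size of the minimum-state target DFA.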