Communications of the ACM
Efficient learning of context-free grammars from positive structural examples. Information and Computation.
Recent advances of grammatical inference. Theoretical Computer Science - Special Issue on Algorithmic Learning Theory.
Inference of Reversible Languages. Journal of the ACM (JACM).
The use of grammatical inference for designing programming languages. Communications of the ACM.
Data Structures and Algorithms.
Current Trends in Grammatical Inference. Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition.
Inside-outside reestimation from partially bracketed corpora. ACL '92: Proceedings of the 30th Annual Meeting of the Association for Computational Linguistics.
Grammatical Inference in Bioinformatics. IEEE Transactions on Pattern Analysis and Machine Intelligence.
A bibliographical study of grammatical inference. Pattern Recognition.
This paper describes a formalization, based on tree automata, for incremental learning of context-free grammars from positive samples of their structural descriptions. A structural description of a context-free grammar is a derivation tree of the grammar from which the nonterminal labels have been removed. Tree-automata-based learning in this paradigm was first introduced by Sakakibara in 1992; however, his scheme assumes that all training examples are available to the learning algorithm from the start (i.e., it cannot be used for online learning), and it does not optimize storage requirements. Our model has several desirable features: it runs in O(n^3) time in the sum of the sizes of the input examples, achieves an O(n) saving in storage space, exhibits good incremental behavior by updating its guess incrementally, and efficiently infers a grammar from positive-only examples. Several examples and experimental results are given to illustrate the scheme and its efficient execution.
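The notion of a structural description above can be illustrated concretely. The following is a minimal sketch, not the paper's implementation: it assumes a derivation tree is encoded as a nested tuple `(label, children)` with terminal leaves as strings, and the hypothetical `skeleton` function replaces every internal (nonterminal) label with a single placeholder symbol, keeping only the tree shape and the terminal yield.

```python
def skeleton(tree, placeholder="?"):
    """Strip internal node labels from a derivation tree.

    A tree is either a terminal symbol (a string leaf) or a pair
    (nonterminal_label, children). The returned skeleton preserves
    the branching structure and the leaves, but forgets which
    nonterminal labeled each internal node.
    """
    if isinstance(tree, str):
        # Leaves are terminal symbols; they are kept as-is.
        return tree
    _label, children = tree
    # Replace the nonterminal label with the placeholder and recurse.
    return (placeholder, [skeleton(c, placeholder) for c in children])

# Derivation tree of "aab" under a toy grammar S -> A B, A -> a A | a, B -> b
t = ("S", [("A", ["a", ("A", ["a"])]), ("B", ["b"])])
print(skeleton(t))
# ('?', [('?', ['a', ('?', ['a'])]), ('?', ['b'])])
```

Positive samples of such skeletons are exactly the kind of input the learning algorithm consumes: the learner sees the bracketing structure of the sentences but not the grammar's nonterminal names.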