Efficient Incremental Model for Learning Context-Free Grammars from Positive Structural Examples

  • Authors:
  • Gend Lal Prajapati;Narendra S. Chaudhari;Manohar Chandwani

  • Affiliations:
  • Institute of Engineering & Technology, Department of Computer Engineering, Devi Ahilya University, Indore, India 452017;School of Computer Engineering, Nanyang Technological University, Singapore 639798;Institute of Engineering & Technology, Department of Computer Engineering, Devi Ahilya University, Indore, India 452017

  • Venue:
  • SETN '08: Proceedings of the 5th Hellenic Conference on Artificial Intelligence: Theories, Models and Applications
  • Year:
  • 2008

Abstract

This paper describes a formalization based on tree automata for incremental learning of context-free grammars from positive samples of their structural descriptions. A structural description of a context-free grammar is a derivation tree of the grammar in which the nonterminal labels are removed. Tree-automata-based learning in this paradigm was first introduced by Sakakibara in 1992; however, his scheme assumes that all training examples are available to the learning algorithm at the outset (i.e., it cannot be employed for online learning), and it does not optimize storage requirements. Our model has several desirable features: it runs in O(n³) time in the sum of the sizes of the input examples, achieves an O(n) saving in storage space, exhibits good incremental behavior by updating its guess incrementally, and efficiently infers a grammar from positive-only examples. Several examples and experimental results are given to illustrate the scheme and its efficient execution.
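To make the notion of a "positive structural example" concrete, the following minimal Python sketch (not the authors' implementation) builds a derivation tree and erases its internal nonterminal labels, yielding the unlabeled skeleton that serves as a training example in this paradigm. The class and function names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Tuple, Union

@dataclass
class Node:
    """A derivation-tree node: `label` is a nonterminal for internal nodes
    or a terminal symbol for leaves."""
    label: str
    children: List["Node"] = field(default_factory=list)

Skeleton = Union[str, Tuple["Skeleton", ...]]

def skeleton(t: Node) -> Skeleton:
    """Erase internal (nonterminal) labels, keeping only the tree shape
    and the terminal symbols at the leaves."""
    if not t.children:                     # leaf: keep the terminal symbol
        return t.label
    return tuple(skeleton(c) for c in t.children)

# Example: derivation tree of "ab" under S -> A B, A -> a, B -> b
tree = Node("S", [Node("A", [Node("a")]), Node("B", [Node("b")])])
print(skeleton(tree))                      # (('a',), ('b',))
```

In an incremental setting such as the one described here, skeletons like the one printed above would be presented to the learner one at a time, and the current grammar hypothesis would be updated after each example rather than recomputed from the full sample.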