It is investigated for which choices of the parameter q, denoting the number of contexts, the class of simple external contextual languages is iteratively learnable. On the one hand, the class admits, for every value of q, polynomial-time learnability provided an adequate choice of the hypothesis space is made. On the other hand, additional constraints such as consistency and conservativeness, or the use of a one-one hypothesis space, change the picture: iterative learning limits the long-term memory of the learner to the current hypothesis, and these constraints further hinder the storage of information via padding of this hypothesis. It is shown that for q ≥ 3 the simple external contextual languages are not iteratively learnable using a class-preserving one-one hypothesis space, while for q = 1 they are iteratively learnable, even in polynomial time. For the intermediate values of q, there is some indication that iterative learnability using a class-preserving one-one hypothesis space might depend on the size of the alphabet. It is also investigated for which choices of the parameters the simple external contextual languages can be learnt by a consistent and conservative iterative learner.
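As a reading aid, the iterative-learning protocol referred to in the abstract can be sketched in the standard formulation of learning in the limit from positive data; the symbols M, h_n and x_n below are introduced here for illustration and are not taken from the paper. The learner is a computable function M that, on each new datum, may consult only its current hypothesis:

\[
  h_{n+1} = M(h_n, x_n) \quad (n \ge 0),
\]

where x_0, x_1, x_2, \ldots is a text (an enumeration of all strings) of the target language L and h_0 is a fixed initial hypothesis; M identifies L iff the sequence (h_n) converges to one hypothesis that generates exactly L. In this standard setting, consistency requires each h_n to generate all data seen so far, conservativeness forbids abandoning a hypothesis that already generates the new datum, and padding means encoding extra information into the hypothesis without changing the language it generates, which a one-one (one hypothesis per language) hypothesis space rules out.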