Systems that learn: an introduction to learning theory for cognitive and computer scientists
Theory of recursive functions and effective computability
Prudence and other conditions on formal language learning
Information and Computation
Polynomial-time inference of arbitrary pattern languages
New Generation Computing - Selected papers from the international workshop on algorithmic learning theory, 1990
Language learning in dependence on the space of hypotheses
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
On the role of procrastination in machine learning
Information and Computation
Rich classes inferable from positive data
Information and Computation
On the intrinsic complexity of learning
Information and Computation
Incremental learning from positive data
Journal of Computer and System Sciences
Handbook of formal languages, vol. 3
Incremental concept learning for bounded data mining
Information and Computation
Journal of the ACM (JACM)
Marcus Contextual Grammars
Machine Inductive Inference and Language Identification
Proceedings of the 9th Colloquium on Automata, Languages and Programming
Inductive Inference, DFAs, and Computational Complexity
AII '89 Proceedings of the International Workshop on Analogical and Inductive Inference
Contextual grammars as generative models of natural languages
Computational Linguistics
Results on memory-limited U-shaped learning
Information and Computation
Learning efficiency of very simple grammars from positive data
Theoretical Computer Science
Parallelism increases iterative learning power
Theoretical Computer Science
Inferring grammars for mildly context sensitive languages in polynomial-time
ICGI'06 Proceedings of the 8th international conference on Grammatical Inference: algorithms and applications
ALT'10 Proceedings of the 21st international conference on Algorithmic learning theory
Theoretical Computer Science
Theoretical Computer Science
It is investigated for which choices of the parameter q, denoting the number of contexts, the class of simple external contextual languages is iteratively learnable. On the one hand, the class admits, for all values of q, polynomial-time learnability, provided an adequate hypothesis space is chosen. On the other hand, additional constraints such as consistency and conservativeness, or the use of a one-one hypothesis space, change the picture: iterative learning limits the long-term memory of the learner to its current hypothesis, and these constraints further hinder the storage of information via padding of this hypothesis. It is shown that if q ≥ 3, then the simple external contextual languages are not iteratively learnable using a class-preserving one-one hypothesis space, while for q = 1 the class is iteratively learnable, even in polynomial time. It is also investigated for which choices of the parameters the simple external contextual languages can be learnt by a consistent and conservative iterative learner.
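To make the object of study concrete, the following is a minimal sketch of how a simple external contextual language can be enumerated, under a simplified reading of the definition: a grammar is taken to be a base word together with q contexts (u, v), and a derivation step wraps the whole current string externally, w → u·w·v. The function name and representation are illustrative assumptions, not the paper's exact formalism.

```python
def sec_language(base, contexts, max_steps):
    """Enumerate (a finite part of) a simplified simple external
    contextual language: start from `base` and repeatedly wrap the
    entire string in one of the q contexts (u, v), i.e. w -> u+w+v."""
    current = {base}   # strings derivable in exactly k steps
    seen = set(current)
    for _ in range(max_steps):
        current = {u + w + v for w in current for (u, v) in contexts}
        seen |= current
    return seen

# A grammar with q = 2 contexts over the alphabet {a, b, c}:
lang = sec_language("c", [("a", "b"), ("b", "a")], max_steps=2)
```

Here every string keeps the base "c" in the middle, with nested matching context halves around it; for instance "acb" arises in one step and "aacbb" in two. The number of contexts q is exactly the parameter whose value governs iterative learnability in the results above.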