Gold's original paper on inductive inference introduced a notion of an optimal learner. Intuitively, a learner identifies a class of objects optimally iff there is no other learner that: requires as little of each presentation of each object in the class in order to identify that object, and, for some presentation of some object in the class, requires less of that presentation in order to identify that object. Wiehagen considered this notion in the context of function learning, and characterized an optimal function learner as one that is class-preserving, consistent, and (in a very strong sense) non-U-shaped, with respect to the class of functions learned. Herein, Gold's notion is considered in the context of language learning. Intuitively, a language learner identifies a class of languages optimally iff there is no other learner that: requires as little of each text for each language in the class in order to identify that language, and, for some text for some language in the class, requires less of that text in order to identify that language. Many interesting results concerning optimal language learners are presented. First, it is shown that a characterization analogous to Wiehagen's does not hold in this setting. Specifically, optimality is not sufficient to guarantee Wiehagen's conditions; those conditions are, however, sufficient to guarantee optimality. Second, it is shown that the failure of this analog is not due to a restriction on algorithmic learning power imposed by non-U-shapedness (in the strong form employed by Wiehagen). That is, non-U-shapedness, even in this strong form, does not restrict algorithmic learning power. Finally, for an arbitrary optimal learner $F$ of a class of languages $\mathcal{L}$, it is shown that $F$ optimally identifies a subclass $\mathcal{K}$ of $\mathcal{L}$ iff $F$ is class-preserving with respect to $\mathcal{K}$.
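The identification-in-the-limit setting underlying the abstract can be illustrated with a toy sketch (not from the paper: the class $L_n = \{0, 1, \ldots, n\}$ and the max-based learner below are illustrative assumptions). The learner conjectures an index after each element of a text, and "requiring as little of a text" corresponds to the point at which its conjectures stabilize:

```python
def learner(prefix):
    # Conjecture: the language is L_n, where n is the largest
    # element seen so far in the text prefix (0 if nothing seen).
    return max(prefix) if prefix else 0

def convergence_point(text):
    # Length of the shortest prefix after which the learner's
    # conjecture never changes again on this (finite) text.
    final = learner(text)
    for i in range(1, len(text) + 1):
        if all(learner(text[:j]) == final for j in range(i, len(text) + 1)):
            return i
    return len(text)

# A text for L_3 = {0, 1, 2, 3}: the conjecture stabilizes at 3
# as soon as the element 3 has appeared (here, after 4 elements).
text = [0, 2, 1, 3, 0, 3, 2]
```

On this text, `convergence_point(text)` is 4: the learner locks onto the correct index once it has seen the maximal element. Optimality in Gold's sense, informally, says no other learner can stabilize at least this early on every text while strictly earlier on some text.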