Systems that learn: an introduction to learning theory for cognitive and computer scientists.
Recursively enumerable sets and degrees.
Prudence and other conditions on formal language learning. Information and Computation; COLT '88 Proceedings of the first annual workshop on Computational learning theory.
An introduction to Kolmogorov complexity and its applications.
Language learning in dependence on the space of hypotheses. COLT '93 Proceedings of the sixth annual conference on Computational learning theory.
Characterizations of monotonic and dual monotonic language learning. Information and Computation.
Monotonic and dual monotonic language learning. Theoretical Computer Science.
Angluin's theorem for indexed families of r.e. sets and applications. COLT '96 Proceedings of the ninth annual conference on Computational learning theory.
The Power of Vacillation in Language Learning. SIAM Journal on Computing.
Inductive Inference: Theory and Methods. ACM Computing Surveys (CSUR).
Machine Inductive Inference and Language Identification. Proceedings of the 9th Colloquium on Automata, Languages and Programming.
A Guided Tour Across the Boundaries of Learning Recursive Languages. Algorithmic Learning for Knowledge-Based Systems, GOSLER Final Report.
A Thesis in Inductive Inference. Proceedings of the 1st International Workshop on Nonmonotonic and Inductive Logic.
Separation of uniform learning classes. Theoretical Computer Science, Special issue: Algorithmic learning theory.
Increasing the power of uniform inductive learners. Journal of Computer and System Sciences, Special issue on COLT 2002; Information and Computation.
Learning in Friedberg numberings. Information and Computation.
Hypothesis spaces for learning. Information and Computation.
Learnability of co-r.e. classes. LATA'12 Proceedings of the 6th international conference on Language and Automata Theory and Applications.
This work extends the studies of Angluin, Lange and Zeugmann on how learning depends on the hypothesis space chosen for the language class, in the setting of learning uniformly recursive language classes. Their studies formulated the concepts of class-comprising learning (where the learner may choose a uniformly recursively enumerable superclass of the target class as the hypothesis space) and class-preserving learning (where the learner must choose a uniformly recursively enumerable hypothesis space generating exactly the target class). Subsequent investigations considered uniformly recursively enumerable hypothesis spaces. In the present work, we extend these studies by asking whether learners can be synthesized effectively from a given hypothesis space when learning uniformly recursively enumerable language classes. To this end, we introduce the concepts of prescribed learning (where there must be a learner for every uniformly recursively enumerable hypothesis space generating exactly the class) and uniform learning (like prescribed learning, but the learner must be synthesized effectively from an index of the hypothesis space). It is shown that while these four types of learnability coincide for explanatory learning, some or all of them differ for other learning criteria; for conservative learning, all four types are different. Several results are obtained for vacillatory and behaviourally correct learning: three of the four types can be separated, but the relation between prescribed and uniform learning remains open. It is also shown that every behaviourally correct learnable class, not necessarily uniformly recursively enumerable, has a prudent learner, that is, a learner using a hypothesis space such that the learner learns every set in that hypothesis space. Moreover, the prudent learner can be built effectively from any learner for the class.
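To make the four notions concrete, the following LaTeX sketch states one standard way to formalise them for explanatory (TxtEx) learning from text; the symbols $\mathcal{L}$, $H$ and $M$ and the exact phrasing are illustrative assumptions rather than the paper's own notation.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch (assumed notation): a hypothesis space is a uniformly r.e. family
% H = (H_0, H_1, ...), i.e. the set {(i,x) : x in H_i} is recursively enumerable.
% A learner M explanatorily (TxtEx-) learns L with respect to H if, on every text
% for L, its conjectures converge to a single index e with H_e = L.
\begin{itemize}
  \item \textbf{Class-comprising:} there are a u.r.e.\ space $H$ with
    $\mathcal{L}\subseteq\{H_0,H_1,\dots\}$ and a learner $M$ that learns every
    $L\in\mathcal{L}$ with respect to $H$.
  \item \textbf{Class-preserving:} as above, but with $\{H_0,H_1,\dots\}=\mathcal{L}$.
  \item \textbf{Prescribed:} for every u.r.e.\ space $H$ with
    $\{H_0,H_1,\dots\}=\mathcal{L}$ there is some learner that learns $\mathcal{L}$
    with respect to $H$.
  \item \textbf{Uniform:} a single effective procedure maps any index of a
    u.r.e.\ space $H$ with $\{H_0,H_1,\dots\}=\mathcal{L}$ to a learner for
    $\mathcal{L}$ with respect to $H$.
\end{itemize}
\end{document}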