Theory of recursive functions and effective computability
On uniform learnability of language families
Information Processing Letters
Strong separation of learning classes
Journal of Experimental & Theoretical Artificial Intelligence
How inductive inference strategies discover their errors
Information and Computation
The synthesis of language learners
Information and Computation
A Machine-Independent Theory of the Complexity of Recursive Functions
Journal of the ACM (JACM)
Algorithmic Learning for Knowledge-Based Systems, GOSLER Final Report
On the Comparison of Inductive Inference Criteria for Uniform Learning of Finite Classes
ALT '01 Proceedings of the 12th International Conference on Algorithmic Learning Theory
Synthesizing inductive expertise
Information and Computation
Increasing the power of uniform inductive learners
Journal of Computer and System Sciences - Special issue on COLT 2002
The fundamental learning model considered here is identification of recursive functions in the limit, as introduced by Gold [8], but the concept is investigated on a meta-level. A set of classes of recursive functions is uniformly learnable under an inference criterion I if there is a single learner that, from a description of any class in the set, synthesizes a learner for that class. The particular question discussed here is whether unions of uniformly learnable sets of such classes can still be identified uniformly. In particular, unions of classes leading to strong separations of inference criteria in the uniform model are considered. The main result is that for any pair (I, I′) of distinct inference criteria considered here, there exists a fixed set of descriptions of learning problems from I such that its union with any uniformly I-learnable collection is uniformly I′-learnable, but no longer uniformly I-learnable.
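The uniform setting above can be illustrated with a toy sketch (an assumption-laden illustration, not the paper's construction): a meta-learner receives a description of a class — here simply an explicit finite list of total functions, a much weaker description format than the paper's — and synthesizes a learner that identifies any member of that class in the limit by enumeration, outputting the first hypothesis consistent with the data seen so far.

```python
from typing import Callable, List, Sequence

def synthesize_learner(cls: List[Callable[[int], int]]):
    """Toy uniform learner: given a description of a class (here, an
    explicit finite list of total functions), return a learner that maps
    a data prefix [f(0), f(1), ..., f(n)] to a hypothesis index."""
    def learner(prefix: Sequence[int]) -> int:
        # Identification by enumeration: guess the first function in the
        # list that is consistent with every data point seen so far.
        for i, f in enumerate(cls):
            if all(f(x) == v for x, v in enumerate(prefix)):
                return i
        return -1  # no consistent hypothesis in the described class
    return learner

# A hypothetical class {f_c : c < 5} with f_c(x) = c * x; target is f_3.
cls = [lambda x, c=c: c * x for c in range(5)]
learn = synthesize_learner(cls)
target = cls[3]
guesses = [learn([target(x) for x in range(n + 1)]) for n in range(4)]
print(guesses)  # the guess sequence stabilizes on index 3
```

On growing prefixes of the target's graph, the hypothesis sequence eventually converges to a correct index — Gold-style identification in the limit. The paper's results concern far richer description formats and criteria, where such convergence is not guaranteed uniformly.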