Gold introduced the notion of learning in the limit, where a class S is learnable iff there is a recursive machine M which reads the course of values of a function f and converges to a program for f whenever f is in S. An important measure of the speed of convergence in this model is the number of mind changes before the onset of convergence. The oldest model considers a constant bound on the number of mind changes M makes on any input function; such a bound is referred to here as a bound of type 1. Later this was generalized to bounds of type 2, where a counter ranges over constructive ordinals and is counted down at every mind change. Although ordinal bounds permit the inference of richer concept classes than constant bounds, they are still a severe restriction. The present work therefore introduces two more general approaches to bounding mind changes: counting down in a linearly ordered set (type 3) and counting down in a partially ordered set (type 4). In both cases the set must not contain infinite descending recursive sequences. These four types of mind change bounds form a hierarchy, and there are identifiable classes that cannot be learned even with the most general mind change bound of type 4. It is shown that the existence of a type 2 bound is equivalent to the existence of a learning algorithm which converges on every (also nonrecursive) input function, and the existence of a type 4 bound is equivalent to the existence of a learning algorithm which converges on every recursive function. A partial characterization of type 3 yields a result of independent interest in recursion theory. The interplay between mind change complexity and the choice of hypothesis space is also investigated. It is established that for certain concept classes, a more expressive hypothesis space can reduce the mind change complexity of learning these classes.
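To make the counter mechanism concrete, here is a minimal sketch (not from the paper) of a limit learner for a hypothetical toy class: the threshold functions f_n with f_n(x) = 1 iff x >= n, together with the all-zero function. The learner revises its conjecture at most once, so a constant counter of 1 — a type 1 bound — suffices; the counter is decremented exactly when the conjecture changes.

```python
def learn_threshold(stream):
    """Limit learner for the toy class {f_n : f_n(x) = 1 iff x >= n} plus the
    all-zero function, reading the course of values f(0), f(1), ...
    Emits a conjecture after each value; the initial conjecture is free,
    and at most one mind change occurs (a type 1 bound with constant 1)."""
    hypothesis = "all-zero"   # initial conjecture (not counted as a mind change)
    counter = 1               # type 1 mind change counter
    conjectures = []
    for x, value in enumerate(stream):
        if value == 1 and hypothesis == "all-zero":
            hypothesis = f"threshold at {x}"  # revise: a mind change
            counter -= 1                      # count it down
        conjectures.append(hypothesis)
    return conjectures, counter

# On f_2 the learner changes its mind once, at the first 1 it sees:
conjs, counter = learn_threshold([0, 0, 1, 1, 1])
```

A type 2 bound replaces the constant by a constructive ordinal notation that is counted down at each mind change; types 3 and 4 generalize the counter's range to linearly and partially ordered sets without infinite descending recursive sequences.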
The notion of a mind change bound for behaviourally correct learning is addressed indirectly by employing the above four types to restrict the number of predictive errors of commission in finite-error next-value learning (NV'') -- a model equivalent to behaviourally correct learning. Again, natural characterizations of the type 2 and type 4 bounds are derived. Their naturalness is further illustrated by characterizing them in terms of branches of uniformly recursive families of binary trees.
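In the next-value setting, the quantity being bounded is not the conjecture changes but the wrong predictions. As a hypothetical illustration (again not from the paper), consider the toy class of threshold functions f_n with f_n(x) = 1 iff x >= n: a predictor that guesses 0 until it has seen a 1 and guesses 1 thereafter mispredicts at most once on any function in the class, so its errors of commission admit a constant bound in the same way mind changes do for a limit learner.

```python
def nv_predict(stream):
    """Next-value predictor for the toy class {f_n : f_n(x) = 1 iff x >= n}:
    before reading f(x) it predicts f(x) from the values seen so far, and
    wrong predictions (errors of commission) are counted.  On any f_n the
    single error occurs at x = n, where the stream first switches to 1."""
    seen_one = False
    errors = 0
    for value in stream:
        prediction = 1 if seen_one else 0  # predict continuation of the pattern
        if prediction != value:
            errors += 1                    # an error of commission
        if value == 1:
            seen_one = True
    return errors
```

Restricting such error counts by counters of types 1 through 4 is what transfers the mind change hierarchy to the NV'' model.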