Communications of the ACM
Systems that learn: an introduction to learning theory for cognitive and computer scientists
The position of index sets of identifiable sets in the arithmetical hierarchy
Information and Control
Recursively enumerable sets and degrees
Learning regular sets from queries and counterexamples
Information and Computation
Probabilistic inductive inference
Journal of the ACM (JACM)
Learning via queries to an oracle
COLT '89 Proceedings of the second annual workshop on Computational learning theory
Monotonic and non-monotonic inductive inference
New Generation Computing - Selected papers from the international workshop on algorithmic learning theory, 1990
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Types of monotonic language learning and their characterization
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Journal of the ACM (JACM)
Language learning in dependence on the space of hypotheses
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
On the intrinsic complexity of language identification
COLT '94 Proceedings of the seventh annual conference on Computational learning theory
On the intrinsic complexity of learning
Information and Computation
On the structure of degrees of inferability
Journal of Computer and System Sciences
Monotonic and dual monotonic language learning
Theoretical Computer Science
Probabilistic language learning under monotonicity constraints
Theoretical Computer Science - Special issue on algorithmic learning theory
Aspects of complexity of conservative probabilistic learning
COLT '98 Proceedings of the eleventh annual conference on Computational learning theory
A Machine-Independent Theory of the Complexity of Recursive Functions
Journal of the ACM (JACM)
Introduction To Automata Theory, Languages, And Computation
Generalization and specialization strategies for learning r.e. languages
Annals of Mathematics and Artificial Intelligence
Identifying nearly minimal Gödel numbers from additional information
Annals of Mathematics and Artificial Intelligence
On the Query Complexity of Sets
MFCS '96 Proceedings of the 21st International Symposium on Mathematical Foundations of Computer Science
A Guided Tour Across the Boundaries of Learning Recursive Languages
Algorithmic Learning for Knowledge-Based Systems, GOSLER Final Report
EuroCOLT '97 Proceedings of the Third European Conference on Computational Learning Theory
Refined Query Inference (Extended Abstract)
AII '89 Proceedings of the International Workshop on Analogical and Inductive Inference
Probabilistic Language Learning Under Monotonicity Constraints
ALT '95 Proceedings of the 6th International Conference on Algorithmic Learning Theory
Monotonic and Nonmonotonic Inductive Inference of Functions and Patterns
Proceedings of the 1st International Workshop on Nonmonotonic and Inductive Logic
A Thesis in Inductive Inference
Proceedings of the 1st International Workshop on Nonmonotonic and Inductive Logic
Monotonic Versus Nonmonotonic Language Learning
Proceedings of the Second International Workshop on Nonmonotonic and Inductive Logic
In the setting of learning indexed families, probabilistic learning under monotonicity constraints is more powerful than deterministic learning under monotonicity constraints, even if the probability is close to 1, provided the learning machines are restricted to proper or class-preserving hypothesis spaces (cf. Meyer, Theoret. Comput. Sci. 185 (1997) 81--128). In this paper, we investigate the relation between probabilistic learning and oracle identification under monotonicity constraints. In particular, we address the question of how much additional information provided by oracles is necessary to compensate for the additional power of probabilistic learning machines. In Section 1, we show that the oracle K is necessary and sufficient to compensate for the additional power of probabilistic learning machines in the case of conservative (monotonic) probabilistic learning with p > 1/2 (p > 2/3), and of strong-monotonic probabilistic learning with 1/2 < p ≤ 2/3. In the case of strong-monotonic learning with p > 2/3, however, every Peano-complete oracle is sufficient to compensate for the power of probabilistic learning machines. In contrast, the oracle K is not sufficient to compensate for the power of conservative and strong-monotonic probabilistic learning with probability p = 1/2, or of monotonic probabilistic learning with p = 2/3. The main result in Section 2 is that for each oracle A ≤T K, there exists an indexed family LA which is properly conservatively identifiable with p = 1/2 and which exactly reflects the Turing degree of A, i.e., LA is properly conservatively identifiable by an oracle machine M[B] iff A ≤T B. Thus, for every oracle A below K, we can construct a learning problem characterizing A within proper conservative learning. However, not every indexed family which is conservatively identifiable with probability p = 1/2 reflects the Turing degree of an oracle. Hence, the conservative probabilistic learning classes are more richly structured than the Turing degrees below K.
Finally, we prove that there exist learning problems which are conservatively (monotonically) identifiable with probability p=1/2 (p=2/3), but conservatively (monotonically) identifiable only by oracle machines having access to TOT. For strong-monotonic learning, this result does not hold. Copyright 2001 Elsevier Science B.V. All rights reserved.