Systems that Learn: An Introduction to Learning Theory for Cognitive and Computer Scientists.
Recursively Enumerable Sets and Degrees.
Probabilistic inductive inference. Journal of the ACM (JACM).
Learning via queries to an oracle. COLT '89: Proceedings of the Second Annual Workshop on Computational Learning Theory.
Monotonic and non-monotonic inductive inference. New Generation Computing, selected papers from the International Workshop on Algorithmic Learning Theory, 1990.
COLT '91: Proceedings of the Fourth Annual Workshop on Computational Learning Theory.
Journal of the ACM (JACM).
Language learning in dependence on the space of hypotheses. COLT '93: Proceedings of the Sixth Annual Conference on Computational Learning Theory.
Probability is more powerful than team for language identification from positive data. COLT '93: Proceedings of the Sixth Annual Conference on Computational Learning Theory.
On the structure of degrees of inferability. Journal of Computer and System Sciences.
Probabilistic language learning under monotonicity constraints. Theoretical Computer Science, special issue on algorithmic learning theory.
Aspects of complexity of conservative probabilistic learning. COLT '98: Proceedings of the Eleventh Annual Conference on Computational Learning Theory.
A Machine-Independent Theory of the Complexity of Recursive Functions. Journal of the ACM (JACM).
Inductive Inference: Theory and Methods. ACM Computing Surveys (CSUR).
Introduction to Automata Theory, Languages, and Computation.
A Guided Tour Across the Boundaries of Learning Recursive Languages. Algorithmic Learning for Knowledge-Based Systems, GOSLER Final Report.
Probabilistic Language Learning Under Monotonicity Constraints. ALT '95: Proceedings of the 6th International Conference on Algorithmic Learning Theory.
Monotonic and Nonmonotonic Inductive Inference of Functions and Patterns. Proceedings of the 1st International Workshop on Nonmonotonic and Inductive Logic.
A Thesis in Inductive Inference. Proceedings of the 1st International Workshop on Nonmonotonic and Inductive Logic.
Monotonic Versus Nonmonotonic Language Learning. Proceedings of the Second International Workshop on Nonmonotonic and Inductive Logic.
In the setting of learning indexed families, probabilistic learning under monotonicity constraints is more powerful than deterministic learning under monotonicity constraints, even if the probability is close to 1, provided the learning machines are restricted to proper or class-preserving hypothesis spaces (cf. [19]). In this paper, we investigate the relation between probabilistic learning and oracle identification under monotonicity constraints. In particular, we address the question of how much "additional information" provided by oracles is necessary to compensate for the additional power of probabilistic learning. If the oracle machines have access to the oracle K, then they can fully compensate for the power of monotonic (conservative) probabilistic machines, provided the probability p is greater than 2/3 (1/2). Furthermore, we show that for every recursively enumerable oracle A, there exists a learning problem which is strong-monotonically learnable by an oracle machine having access to A, but not conservatively or monotonically learnable with any probability p > 0. A similar result holds for Peano-complete oracles. However, probabilistic learning under monotonicity constraints is "rich" enough to encode every recursively enumerable set in a characteristic learning problem, i.e., for every recursively enumerable set A and every p > 2/3, there exists a learning problem L_A which is monotonically learnable with probability p, and monotonically learnable with oracle B if and only if A is Turing-reducible to B. The same result holds for conservative probabilistic learning with p > 1/2, and for strong-monotonic learning with probability p = 2/3. In particular, it follows that probabilistic learning under monotonicity constraints cannot be characterized in terms of oracle identification. Moreover, we settle an open problem posed in [19] by showing that the probabilistic hierarchies of class-preserving monotonic and conservative probabilistic learning are dense. Finally, we show that these probability bounds are strict: for monotonic probabilistic learning with probability p = 2/3, conservative probabilistic learning with probability p = 1/2, and strong-monotonic probabilistic learning with probability p = 1/2, the oracle K is not sufficient to compensate for the power of probabilistic learning under monotonicity constraints.
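Read schematically, the quantitative claims of the abstract fit the pattern below. The shorthand used here (MonProb(p), Mon[B], and their conservative analogues) is introduced purely for illustration and is not the paper's own notation; the inequalities follow the reading of the abstract given above.

% A schematic LaTeX restatement of the three main claims.
% Illustrative shorthand (assumed, not the paper's notation):
%   MonProb(p)  -- classes monotonically learnable with probability p
%   Mon[B]      -- classes monotonically learnable with oracle B
%   ConsProb(p), Cons[B] -- the conservative analogues
\begin{align*}
  &\text{Compensation:} && p > \tfrac{2}{3} \implies \mathrm{MonProb}(p) \subseteq \mathrm{Mon}[K],
     \quad p > \tfrac{1}{2} \implies \mathrm{ConsProb}(p) \subseteq \mathrm{Cons}[K].\\
  &\text{Encoding:} && \text{for every r.e. } A \text{ and every } p > \tfrac{2}{3},
     \text{ there is } L_A \in \mathrm{MonProb}(p)\\
  &&& \text{such that } L_A \in \mathrm{Mon}[B] \iff A \leq_T B.\\
  &\text{Strictness:} && \mathrm{MonProb}\!\left(\tfrac{2}{3}\right) \not\subseteq \mathrm{Mon}[K],
     \quad \mathrm{ConsProb}\!\left(\tfrac{1}{2}\right) \not\subseteq \mathrm{Cons}[K].
\end{align*}

Under this reading, the non-characterization claim follows from combining the lines: if MonProb(p) coincided with Mon[B] for some fixed oracle B, the encoding line (with A of the same Turing degree as K) would force B to compute K, while the earlier result on recursively enumerable oracles yields problems learnable with such an oracle but with no probability p > 0, so no single oracle B can match MonProb(p) exactly.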