A consistent learner is required to correctly and completely reflect in its current hypothesis all data received so far. Although this demand sounds quite plausible, it may render the learning problem unsolvable. Therefore, in the present paper several variations of consistent learning are introduced and studied. These variations relax the consistency demand by a so-called δ-delay, requiring consistency with all but the last δ data. Additionally, we introduce the notion of coherent learning (again with δ-delay), which requires the learner to correctly reflect only the last datum seen (the (n−δ)-th datum, respectively). Our results are threefold. First, we provide characterizations of consistent learning with δ-delay in terms of complexity and computable numberings. Second, we establish strict hierarchies for all consistent learning models with δ-delay, in dependence on δ. Finally, we show that every model of coherent learning with δ-delay is exactly as powerful as its corresponding consistent learning model with δ-delay.
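To make the two relaxed demands concrete, here is a minimal sketch, in Python, of how one might check them on a finite data sequence. The function names and the encoding of data as (input, value) pairs are illustrative assumptions, not notation from the paper: δ-delay consistency requires the hypothesis to agree with all data except possibly the last δ items, while δ-delay coherence only inspects the (n−δ)-th datum.

```python
# Illustrative sketch (not from the paper): checking delta-delay
# consistency and coherence of a hypothesis on a finite data sequence
# data = [(x_0, f(x_0)), ..., (x_n, f(x_n))].

def is_consistent_with_delay(hypothesis, data, delta):
    """True iff hypothesis(x_i) == f(x_i) for all i <= n - delta,
    i.e. the hypothesis reflects all but the last delta data."""
    n = len(data) - 1
    return all(hypothesis(x) == y for x, y in data[: n - delta + 1])

def is_coherent_with_delay(hypothesis, data, delta):
    """True iff the hypothesis correctly reflects the (n - delta)-th
    datum only; vacuously true while fewer than delta + 1 data exist."""
    n = len(data) - 1
    if n - delta < 0:
        return True
    x, y = data[n - delta]
    return hypothesis(x) == y

# Example: a hypothesis that is wrong only on the most recent datum
# is consistent (and coherent) with 1-delay but not with 0-delay.
data = [(0, 0), (1, 1), (2, 4), (3, 9)]
h = lambda x: x * x if x < 3 else 0
```

Note that δ = 0 recovers the classical consistency demand described in the abstract, and coherence with δ-delay checks a single datum rather than an initial segment; the paper's final result says the two models are nevertheless equally powerful.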