Communications of the ACM
On the complexity of inductive inference
Information and Control
Polynomial Time Learnability of Simple Deterministic Languages
Machine Learning
Prediction-preserving reducibility
Journal of Computer and System Sciences - 3rd Annual Conference on Structure in Complexity Theory, June 14–17, 1988
Polynomial-time inference of arbitrary pattern languages
New Generation Computing - Selected papers from the international workshop on algorithmic learning theory, 1990
On the impact of forgetting on learning machines
On the learnability of monotone kμ-DNF formulae under product distributions
Information Processing Letters
A framework for polynomial-time query learnability
Mathematical Systems Theory
On the intrinsic complexity of learning
Information and Computation
A Machine-Independent Theory of the Complexity of Recursive Functions
Journal of the ACM (JACM)
The structure of intrinsic complexity of learning
EuroCOLT '95 Proceedings of the Second European Conference on Computational Learning Theory
Inductive Inference, DFAs, and Computational Complexity
AII '89 Proceedings of the International Workshop on Analogical and Inductive Inference
Too Much Can be Too Much for Learning Efficiently
AII '92 Proceedings of the International Workshop on Analogical and Inductive Inference
Monotonicity versus Efficiency for Learning Languages from Texts
AII '94 Proceedings of the 4th International Workshop on Analogical and Inductive Inference: Algorithmic Learning Theory
Consistent Identification in the Limit of Rigid Grammars from Strings Is NP-hard
ICGI '02 Proceedings of the 6th International Colloquium on Grammatical Inference: Algorithms and Applications
Consistent Identification in the Limit of Any of the Classes k-Valued Is NP-hard
LACL '01 Proceedings of the 4th International Conference on Logical Aspects of Computational Linguistics
Robust Learning - Rich and Poor
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
This paper aims to extend well-known results about polynomial update-time bounded learning strategies in a recursion-theoretic setting. We generalize update-time complexity using the approach of Blum's computational complexity measures. It turns out that consistency is the first natural condition having a narrowing effect for arbitrary update-time bounds. We show the existence of arbitrarily hard consistently learnable sets, as well as an infinite chain of harder and harder consistently learnable sets. The complexity gap between consistent and inconsistent strategies solving the same problem can be arbitrarily large. We prove an exact characterization of polynomial consistent learnability, giving deeper insight into the problem of hard consistent learnability.
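To make the notion of a consistent strategy concrete, here is a minimal sketch (not taken from the paper) of learning by enumeration: a learner that, after each data item, outputs the least hypothesis in a fixed enumeration that agrees with every example seen so far. The toy hypothesis class f_k(x) = k*x and the function names are illustrative assumptions, not the paper's construction.

```python
# Hedged sketch: a consistent learner by enumeration over a toy class.
# A strategy is "consistent" if every hypothesis it outputs agrees with
# all data seen so far. Toy class (an assumption for illustration):
# f_k(x) = k * x for k = 0, 1, 2, ...

def consistent_learner(data):
    """Return the least index k such that k*x == y for every (x, y) in data."""
    k = 0
    while True:
        if all(k * x == y for x, y in data):
            return k  # hypothesis consistent with every example seen so far
        k += 1

# Feed the learner growing initial segments of the graph of f_3.
graph = [(x, 3 * x) for x in range(5)]          # (0,0), (1,3), (2,6), ...
guesses = [consistent_learner(graph[:n]) for n in range(1, len(graph) + 1)]
# Each guess is consistent with the data seen; the sequence converges to 3.
```

Note that enforcing consistency forces a search through the enumeration at every update, which is the intuition behind the paper's theme: requiring consistency can make the update step arbitrarily expensive compared to an inconsistent strategy for the same learning problem.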