Computability-theoretic learning theory (machine inductive inference) typically involves learning programs for languages or functions from a stream of complete data about them and, importantly, allows mind changes as to conjectured programs. This theory takes algorithmicity into account but typically does not take into account the feasibility of computational resources. This paper provides some example results and problems for three ways this theory can be constrained by computational feasibility. Considered are: the learner has memory limitations, the learned programs are desired to be optimal, and there are feasibility constraints on obtaining each output program and on the number of mind changes.
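The learning model sketched above can be illustrated concretely. Below is a minimal, hypothetical Python sketch (not from the paper) of Gold-style identification by enumeration: the learner reads a growing stream of (x, f(x)) pairs, conjectures the least-index hypothesis consistent with the data seen so far, and may revise ("mind-change") that conjecture as more data arrives. The three-element hypothesis class is an illustrative assumption.

```python
# Hypothetical hypothesis class, indexed by "programs" 0..2.
HYPOTHESES = [
    lambda x: 0,      # program 0: constant zero
    lambda x: x,      # program 1: identity
    lambda x: x * x,  # program 2: squaring
]

def consistent(prog, data):
    """True iff hypothesis `prog` agrees with every observed (x, y) pair."""
    return all(HYPOTHESES[prog](x) == y for x, y in data)

def learn(stream):
    """Identification by enumeration: after each datum, conjecture the
    least-index consistent hypothesis; count mind changes along the way."""
    data, conjecture, mind_changes = [], None, 0
    for pair in stream:
        data.append(pair)
        new = next(i for i in range(len(HYPOTHESES)) if consistent(i, data))
        if conjecture is not None and new != conjecture:
            mind_changes += 1
        conjecture = new
    return conjecture, mind_changes

# On data about the squaring function, the learner starts at program 0
# (consistent with (0, 0) alone) and revises twice before settling on
# program 2.
print(learn([(0, 0), (1, 1), (2, 4)]))
```

This toy learner converges in the limit but ignores resource bounds entirely; the paper's constraints correspond to limiting the memory `data` may occupy, requiring the final conjecture to be an optimal program, and bounding the cost of computing each conjecture and the number of increments to `mind_changes`.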