The present work determines the exact nature of the linear-time computable notions that characterise automatic functions (those whose graphs are recognised by a finite automaton). It also determines which of these linear-time notions permit full learnability, in the limit, of automatic classes (families of languages uniformly recognised by a finite automaton). In particular, it is shown that a function is automatic iff there is a one-tape Turing machine with a left end which computes the function in linear time, where both the input before the computation and the output after the computation start at the left end. It is known that learners realised as automatic update functions are restrictive for learning. The present work shows that this restriction can be overcome by providing work tapes in addition to a resource-bounded base tape, while keeping the update time linear in the length of the longest datum seen so far. In this model, one additional work tape already gives more learning power than the automatic-learner model, and two work tapes give full learning power.
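To make the notion of an automatic function concrete, here is a small illustrative sketch (an assumed textbook-style example, not a construction from the paper): the successor function f(x) = x + 1 on least-significant-bit-first binary strings is automatic, because its graph, presented as the convolution of input and output, is recognised by a finite automaton whose only state is the carry bit.

```python
# Illustrative sketch (assumption for exposition, not from the paper):
# the graph of f(x) = x + 1 on LSB-first binary strings is recognised
# by a finite automaton over the convolved alphabet of symbol pairs,
# which is exactly what makes f automatic.

def convolve(x, y, pad="#"):
    """Pad the shorter string with '#' and pair up the symbols."""
    n = max(len(x), len(y))
    x += pad * (n - len(x))
    y += pad * (n - len(y))
    return list(zip(x, y))

def successor_graph_dfa(x, y):
    """Accept iff y encodes x + 1 (both LSB-first binary).

    The only information carried between positions is the carry bit,
    so this run uses constantly many states: a DFA run in disguise.
    """
    carry = 1  # we are adding one
    for a, b in convolve(x, y):
        if b == "#":              # x + 1 is never shorter than x
            return False
        xa = 0 if a == "#" else int(a)
        if int(b) != (xa + carry) % 2:
            return False
        carry = (xa + carry) // 2
    return carry == 0             # no carry may be left over
```

By the paper's characterisation, an equivalent way to certify that such an f is automatic would be to compute it in linear time on a one-tape Turing machine with a left end, with input and output both starting at the left end.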
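Learning in the limit of an automatic class can likewise be illustrated with a toy example (again an assumption for illustration, not the paper's construction): for the automatic class of languages L_v = { w : w is at most v in length-lexicographic order }, a learner that conjectures the length-lexicographically largest datum seen so far stabilises on the correct index v on every positive presentation of L_v.

```python
# Illustrative sketch (hypothetical class and learner, not from the
# paper): a limit learner for the automatic class
#   L_v = { w : w <=_ll v }  (length-lexicographic order).
# The learner's long-term memory is a single datum, in the spirit of
# the memory-bounded learners the paper studies.

def ll_key(w):
    """Length-lexicographic order: shorter strings first, then lexicographic."""
    return (len(w), w)

def learner(stream):
    """Conjecture, after each datum, the ll-largest datum seen so far."""
    hyp = None
    for w in stream:
        if hyp is None or ll_key(w) > ll_key(hyp):
            hyp = w
        yield hyp

# A text (positive presentation) for L_"ba" over the alphabet {a, b}:
text = ["a", "aa", "b", "ba", "ab", "a", "ba"]
conjectures = list(learner(text))
# the sequence of conjectures stabilises on "ba"
```

After finitely many data the conjecture never changes again, which is exactly convergence in the limit; each update touches only the newest datum and the stored one, so the update cost stays linear in the length of the longest datum seen so far.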