The present work studies modes of data presentation between text and informant within the framework of inductive inference. In this model the learner requests sequences of positive and negative data, and the relations between the various formalizations are investigated in dependence on the number of switches between positive and negative data. In particular, it is shown that there is a proper hierarchy among the notions of learning from standard text, learning in the basic switching model, in the newtext switching model, and in the restart switching model. The last of these turns out to be equivalent to the standard notion of learning from informant.
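To make the two endpoints of the spectrum concrete, the following is a minimal sketch (not the paper's construction) of two classic Gold-style learners: one consuming a text (positive data only, with `None` standing in for a pause) and one consuming an informant (labeled examples), the latter via identification by enumeration over an assumed finite hypothesis list. All names and the toy hypothesis class are illustrative assumptions.

```python
def learn_from_text(text):
    """Positive data only: conjecture the finite set of examples seen
    so far. This identifies every finite language in the limit."""
    seen = set()
    for x in text:
        if x is not None:            # None models a pause (#) in the text
            seen.add(x)
        yield frozenset(seen)


def learn_from_informant(informant, hypotheses):
    """Labeled data (x, bool): identification by enumeration.
    Conjecture the first hypothesis in the given (assumed decidable)
    list that is consistent with all data seen so far."""
    data = []
    for x, label in informant:
        data.append((x, label))
        for h in hypotheses:
            if all((y in h) == lab for y, lab in data):
                yield h
                break
        else:
            yield None               # no consistent hypothesis yet


# Usage on toy inputs:
text_hyps = list(learn_from_text([1, None, 2, 3, 3, 1]))
# converges to frozenset({1, 2, 3})

inf_hyps = list(learn_from_informant([(0, False), (1, True)],
                                     [set(), {0}, {0, 1}, {1}]))
# converges to {1}
```

The switching models studied in the paper sit between these two extremes: the learner may change the kind of data it requests (positive or negative) only a bounded or structured number of times, rather than receiving labels with every datum as the informant learner does.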