Communications of the ACM
Combinatorics: set systems, hypergraphs, families of vectors, and combinatorial probability
Linear function neurons: Structure and training
Biological Cybernetics
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
Inductive inference: an abstract approach
COLT '88 Proceedings of the first annual workshop on Computational learning theory
Teachability in computational learning
New Generation Computing - Selected papers from the international workshop on algorithmic learning theory, 1990
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Testing as a dual to learning
Computational learning theory: an introduction
Computers and Intractability: A Guide to the Theory of NP-Completeness
Machine Learning
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
Geometrical concept learning and convex polytopes
COLT '94 Proceedings of the seventh annual conference on Computational learning theory
DNF—if you can't learn 'em, teach 'em: an interactive model of teaching
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
A competitive approach to game learning
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
Characteristic Sets for Polynomial Grammatical Inference
Machine Learning
New methods for competitive coevolution
Evolutionary Computation
Measuring teachability using variants of the teaching dimension
Theoretical Computer Science
ICMCTA '08 Proceedings of the 2nd International Castle Meeting on Coding Theory and Applications
Recent Developments in Algorithmic Teaching
LATA '09 Proceedings of the 3rd International Conference on Language and Automata Theory and Applications
IWCC '09 Proceedings of the 2nd International Workshop on Coding and Cryptology
Teaching randomized learners with feedback
Information and Computation
Models of Cooperative Teaching and Learning
The Journal of Machine Learning Research
Teaching memoryless randomized learners without feedback
ALT'06 Proceedings of the 17th international conference on Algorithmic Learning Theory
Teaching learners with restricted mind changes
ALT'05 Proceedings of the 16th international conference on Algorithmic Learning Theory
COLT'06 Proceedings of the 19th annual conference on Learning Theory
Learning and verifying quantified boolean queries by example
Proceedings of the 32nd symposium on Principles of database systems
Some recent work [7, 14, 15] in computational learning theory has studied learning in situations where the teacher is helpful and can present carefully chosen sequences of labelled examples to the learner. We say a function t in a set H of functions (a hypothesis space) defined on a set X is specified by S ⊆ X if the only function in H that agrees with t on S is t itself. The specification number σ(t) of t is the least cardinality of such an S. For a general hypothesis space, we show that the specification number of any hypothesis is at least equal to a parameter from [14] known as the testing dimension of H. We investigate in some detail the specification numbers of hypotheses in the set Hn of linearly separable boolean functions: we present general methods for finding upper bounds on σ(t), and we characterise those t which have largest σ(t). We obtain a general lower bound on the number of examples required, and we show that for all nested hypotheses this lower bound is attained. We prove that for any t ∈ Hn, there is exactly one set of examples of minimal cardinality (i.e., of cardinality σ(t)) which specifies t. We then discuss those t ∈ Hn which have limited dependence, in the sense that some of the variables are redundant (i.e., there are irrelevant attributes), giving tight upper and lower bounds on σ(t) for such hypotheses. In the final section of the paper, we address the complexity of computing specification numbers and related parameters.
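The definition of a specification number can be made concrete by brute force on the smallest interesting case, n = 2. The sketch below (the helper names `linearly_separable` and `specification_number` are ours, not the paper's, and the small integer weight grid is an assumption that happens to suffice for two variables) enumerates the hypothesis space H2 of linearly separable boolean functions as truth tables over {0,1}^2, then finds σ(t) by searching subsets S of the domain in increasing order of size until only t remains consistent:

```python
from itertools import combinations, product

points = list(product((0, 1), repeat=2))   # the domain X = {0,1}^2, four points

def linearly_separable(table):
    # table[i] is the label of points[i]; search a small grid of integer
    # weights and thresholds, which is enough to realise every threshold
    # function on two boolean variables
    for w1, w2, theta in product(range(-3, 4), repeat=3):
        if all((w1 * x1 + w2 * x2 >= theta) == bool(v)
               for (x1, x2), v in zip(points, table)):
            return True
    return False

# H2: the linearly separable boolean functions on two variables, as truth
# tables -- 14 of the 16 tables (all except XOR and XNOR)
H = [t for t in product((0, 1), repeat=4) if linearly_separable(t)]

def specification_number(t, H):
    # least |S|, S a subset of the domain, such that t is the only
    # hypothesis in H agreeing with t on every point of S
    for k in range(len(t) + 1):
        for S in combinations(range(len(t)), k):
            if sum(all(h[i] == t[i] for i in S) for h in H) == 1:
                return k

AND = (0, 0, 0, 1)  # truth table of x1 AND x2 over `points`
print(len(H), specification_number(AND, H))  # prints: 14 3
```

For AND, no two labelled points rule out every other threshold function, but the three points (0,1), (1,0), (1,1) do: the only other table agreeing there is XNOR, which is not linearly separable, so σ(AND) = 3. The search is exponential in |X|, consistent with the paper's closing discussion of the complexity of computing specification numbers.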