In a typical algorithmic learning model, a learner has to identify a target object from partial information. Conversely, in a teaching model a teacher has to provide information that allows the learners to identify a target object. We devise two variants of the classical teaching model for Boolean concept classes, both based on the teaching dimension, and characterize them by teaching-dimension-like combinatorial parameters. In the first model, the learners choose consistent hypotheses of least complexity. We show that 1-decision lists become harder to teach the longer they are, and that 2-term DNFs become harder to teach the more terms they have. This contrasts with the teachability results for these classes in the teaching-dimension model. In our second model, the learners choose consistent hypotheses under the assumption that the teacher is optimal. We show that monomials can be taught with linearly many examples, whereas some 1-decision lists require exponentially many.
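To make the underlying notion concrete: the classical teaching dimension of a concept is the size of a smallest labeled sample that is consistent with that concept but with no other concept in the class. A minimal brute-force sketch, assuming concepts are given as Boolean functions over an explicit instance space (the function names and the tiny two-variable monomial class below are our own, purely for illustration):

```python
from itertools import product, combinations

def teaching_dim(target, concepts, instances):
    """Size of a smallest sample consistent with `target` and no other concept.

    A sample (a set of instances, implicitly labeled by `target`) rules out a
    concept c iff c disagrees with `target` on some instance in the sample.
    """
    others = [c for c in concepts
              if any(c(x) != target(x) for x in instances)]
    for k in range(len(instances) + 1):
        for sample in combinations(instances, k):
            if all(any(c(x) != target(x) for x in sample) for c in others):
                return k
    return None  # unreachable: the full instance space always suffices

# Illustrative class: monomials over two Boolean variables.
instances = list(product([0, 1], repeat=2))
monomials = [
    lambda x: 1,            # empty monomial (always true)
    lambda x: x[0],         # x1
    lambda x: x[1],         # x2
    lambda x: x[0] * x[1],  # x1 AND x2
]
print(teaching_dim(monomials[3], monomials, instances))  # -> 2
```

Here two labeled examples, (0,1) and (1,0) (both negative for x1 AND x2), already rule out the other three monomials, so the teaching dimension of x1 AND x2 in this class is 2. The models discussed in the abstract replace this worst-case "rule out everything else" criterion with learners that prefer simple hypotheses or assume an optimal teacher.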