While most supervised machine learning models assume that training examples are sampled at random or adversarially, this article is concerned with models of learning from a cooperative teacher that selects "helpful" training examples. The number of training examples a learner needs to identify a concept in a given class C of possible target concepts (the sample complexity of C) is lower in models that assume such teachers; that is, "helpful" examples can speed up the learning process. The problem of how a teacher and a learner can cooperate to reduce the sample complexity without resorting to "coding tricks" has been widely addressed. Nevertheless, the resulting teaching and learning protocols do not seem to make the teacher select examples that are intuitively "helpful". The two models introduced in this paper are built on what we call subset teaching sets and recursive teaching sets. They extend previous models of teaching by letting both the teacher and the learner exploit the knowledge that their partner is cooperative. To this end, we introduce a new notion of "coding trick" (collusion). We show how both resulting sample complexity measures (the subset teaching dimension and the recursive teaching dimension) can be arbitrarily lower than the classic teaching dimension and known variants thereof, without using coding tricks. For instance, monomials can be taught with only two examples, independently of the number of variables. The subset teaching dimension turns out to be nonmonotonic with respect to subclasses of concept classes. We discuss why this nonmonotonicity might be inherent in many interesting cooperative teaching and learning scenarios.
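To make the two sample-complexity measures concrete, the following minimal Python sketch (not taken from the paper) computes the classical teaching dimension of a finite concept class by brute-force search, and the recursive teaching dimension via the usual greedy stage-wise construction in which the cheapest-to-teach concepts are removed first. The encoding of concepts as frozensets of positive instances and the toy concept class are illustrative assumptions.

from itertools import combinations

def teaching_dim(c, concepts, X):
    """Size of a smallest teaching set for c: the fewest instances whose
    labels under c are consistent with no other concept in `concepts`."""
    others = [d for d in concepts if d != c]
    for k in range(len(X) + 1):
        for S in combinations(X, k):
            # S teaches c if every other concept disagrees with c somewhere on S
            if all(any((x in c) != (x in d) for x in S) for d in others):
                return k
    return len(X)

def recursive_teaching_dim(concepts, X):
    """Recursive teaching dimension: repeatedly discard the concepts that
    are cheapest to teach w.r.t. the remaining class; the RTD is the
    largest per-stage cost encountered."""
    remaining = set(concepts)
    rtd = 0
    while remaining:
        dims = {c: teaching_dim(c, remaining, X) for c in remaining}
        cheapest = min(dims.values())
        rtd = max(rtd, cheapest)
        remaining = {c for c in remaining if dims[c] > cheapest}
    return rtd

# Toy class over X = {1, 2, 3}: the empty concept plus all singletons.
X = [1, 2, 3]
C = [frozenset()] + [frozenset({x}) for x in X]
print(max(teaching_dim(c, C, X) for c in C))  # classical TD(C) = 3
print(recursive_teaching_dim(C, X))           # RTD(C) = 1

On this toy class the gap is already visible: classically, the empty concept needs all |X| negative examples, whereas in the recursive scheme it is taught trivially once the singletons have been removed. Scaling X up shows how a recursive measure can be arbitrarily lower than the classical teaching dimension, in the spirit of the abstract's claim; the paper's subset teaching sets achieve comparable savings through a different mechanism.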