On-line learning with an oblivious environment and the power of randomization
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Generalized teaching dimensions and the query complexity of learning
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
How many queries are needed to learn?
Journal of the ACM (JACM)
Machine Learning
The Consistency Dimension and Distribution-Dependent Learning from Queries (Extended Abstract)
ALT '99 Proceedings of the 10th International Conference on Algorithmic Learning Theory
How Many Missing Answers Can Be Tolerated by Query Learners?
STACS '02 Proceedings of the 19th Annual Symposium on Theoretical Aspects of Computer Science
In this paper we study the question of how many queries are needed to "halve a given version space". In other words: how many queries are needed to extract from the learning environment the one bit of information that rules out fifty percent of the concepts that remain candidates for the unknown target concept? We relate this problem to the classical exact learning problem. For instance, we show that lower bounds on the number of queries needed to halve a version space also apply to randomized learners (whereas the classical adversary arguments do not readily apply). Furthermore, we introduce two new combinatorial parameters, the halving dimension and the strong halving dimension, which determine the halving complexity (up to a small constant factor) for two popular models of query learning: learning by a minimum adequate teacher (equivalence queries combined with membership queries) and learning by counterexamples (equivalence queries alone). Finally, these parameters are used to characterize the additional power provided by membership queries (compared to the power of equivalence queries alone). All investigations are purely information-theoretic and ignore computational issues.
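To make the halving question concrete, here is a small illustrative sketch (not taken from the paper; all names are ours). Concepts over a finite instance space are represented as dictionaries mapping instances to booleans, and a membership query is chosen so that, whatever the answer, as large a fraction of the version space as possible is ruled out. When the version space happens to split evenly on some instance, a single query extracts exactly the one bit that eliminates fifty percent of the remaining candidates:

```python
from itertools import product

def best_membership_query(version_space, instances):
    """Pick the instance whose answer is guaranteed to remove the most
    concepts in the worst case over the two possible answers."""
    def worst_case_removed(x):
        ones = sum(1 for c in version_space if c[x])
        zeros = len(version_space) - ones
        return min(ones, zeros)  # size of the side that gets ruled out
    return max(instances, key=worst_case_removed)

def update(version_space, x, answer):
    """Keep only the concepts consistent with the query answer."""
    return [c for c in version_space if c[x] == answer]

# Toy version space: all boolean functions over three instances.
instances = [0, 1, 2]
version_space = [dict(zip(instances, bits))
                 for bits in product([False, True], repeat=3)]

x = best_membership_query(version_space, instances)
# Every instance splits this version space 4/4, so one query halves it.
remaining = update(version_space, x, True)
```

In general the best achievable split can be far from even, which is exactly what the halving dimension and strong halving dimension quantify for the two query models discussed in the abstract.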