In this paper we study the question of how many queries are needed to “halve a given version space”, that is, to extract from the learning environment the one bit of information that rules out fifty percent of the concepts that are still candidates for the unknown target concept. We relate this problem to the classical exact learning problem. For instance, we show that lower bounds on the number of queries needed to halve a version space also apply to randomized learners (whereas the classical adversary arguments do not readily apply). Furthermore, we introduce two new combinatorial parameters, the halving dimension and the strong halving dimension, which determine the halving complexity (up to a small constant factor) for two popular models of query learning: learning by a minimum adequate teacher (equivalence queries combined with membership queries) and learning by counterexamples (equivalence queries alone). Finally, these parameters are used to characterize the additional power provided by membership queries (compared to the power of equivalence queries alone). All investigations are purely information-theoretic and ignore computational issues.
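To make the halving notion concrete, here is a minimal toy sketch in Python (with hypothetical names; this is not the paper's construction). It models concepts as subsets of a small finite domain and shows that a membership query on an instance where the version space splits evenly rules out exactly half of the remaining candidates, i.e. extracts one bit of information:

```python
# Toy illustration of "halving a version space" with a membership query.
# Concepts are modeled as subsets of a small finite domain; a membership
# query on an instance x asks whether x belongs to the unknown target.
# If the version space splits evenly on x, either answer eliminates half
# of the candidate concepts.

from itertools import combinations

domain = range(4)

# Toy version space: all concepts (subsets of the domain) of size 2.
version_space = [frozenset(c) for c in combinations(domain, 2)]

def best_membership_query(vs):
    """Return the instance on which the version space splits most evenly."""
    return min(domain, key=lambda x: abs(sum(x in c for c in vs) - len(vs) / 2))

def update(vs, x, answer):
    """Keep only the concepts consistent with the answer to 'is x in the target?'."""
    return [c for c in vs if (x in c) == answer]

x = best_membership_query(version_space)
positives = update(version_space, x, True)
negatives = update(version_space, x, False)
print(len(version_space), len(positives), len(negatives))  # 6 -> 3 / 3: one bit
```

In this toy example every instance happens to split the version space exactly in half; the combinatorial parameters discussed in the abstract capture, roughly, how badly this can fail for a given concept class and query model.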