We extend the notion of general dimension, a combinatorial characterization of the learning complexity for arbitrary query protocols, to encompass approximate learning. This immediately yields a characterization of the learning complexity in the statistical query model. As a further application, we consider approximate learning of DNF formulas and derive close upper and lower bounds on the number of statistical queries needed. In particular, we show that with respect to the uniform distribution, and for any constant error parameter ε, the number of statistical queries needed to learn DNF formulas (with n variables and s terms) with tolerance τ = Θ(1/s) is n^Θ(log s).
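To make the role of the tolerance parameter concrete: in the statistical query model, the learner never sees labeled examples directly; instead it submits a predicate and receives the predicate's expected value under the target distribution, perturbed by up to τ. The sketch below is a minimal illustration, not the paper's construction — it simulates such an oracle by sampling, with a Hoeffding-style sample size, and all names (`sq_oracle`, the toy single-term DNF target) are hypothetical.

```python
import random

def sq_oracle(query, target, examples, tau):
    """Simulate a statistical query oracle with tolerance tau.

    `query(x, label)` maps an example and its label to [0, 1]; the oracle
    returns an estimate of E[query(x, target(x))] accurate to within
    +/- tau (with high probability), here obtained by averaging over a
    sample of size O(1 / tau^2), as a Hoeffding bound suggests.
    """
    m = max(1, int(4 / tau**2))  # illustrative sample size for additive error tau
    sample = [random.choice(examples) for _ in range(m)]
    return sum(query(x, target(x)) for x in sample) / m

# Hypothetical toy setting: n = 3 Boolean variables under the uniform
# distribution, target is the single-term DNF "x1", and the query asks
# how often the first variable agrees with the label.
n = 3
examples = [tuple((i >> j) & 1 for j in range(n)) for i in range(2 ** n)]
target = lambda x: x[0]
estimate = sq_oracle(lambda x, y: float(x[0] == y), target, examples, tau=0.1)
```

Since the query here agrees with the target on every example, the estimate is exactly 1.0; for non-trivial queries it would deviate by at most roughly τ. The bound in the abstract says that when τ must shrink like Θ(1/s), any learner needs n^Θ(log s) such queries.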