Weakly learning DNF and characterizing statistical query learning using Fourier analysis
STOC '94: Proceedings of the Twenty-Sixth Annual ACM Symposium on Theory of Computing
Information and Computation
General bounds on statistical query learning and PAC learning with noise via hypothesis boosting
Information and Computation
Efficient noise-tolerant learning from statistical queries
Journal of the ACM (JACM)
On Learning Correlated Boolean Functions Using Statistical Queries
ALT '01: Proceedings of the 12th International Conference on Algorithmic Learning Theory
New Lower Bounds for Statistical Query Learning
COLT '02: Proceedings of the 15th Annual Conference on Computational Learning Theory
On using extended statistical queries to avoid membership queries
The Journal of Machine Learning Research
The complexity of learning concept classes with polynomial general dimension
Theoretical Computer Science (Algorithmic Learning Theory, ALT 2002)
A Complete Characterization of Statistical Query Learning with Applications to Evolvability
FOCS '09: Proceedings of the 50th Annual IEEE Symposium on Foundations of Computer Science
A characterization of strong learnability in the statistical query model
STACS '07: Proceedings of the 24th Annual Conference on Theoretical Aspects of Computer Science
A complete characterization of statistical query learning with applications to evolvability
Journal of Computer and System Sciences
Statistical algorithms and a lower bound for detecting planted cliques
Proceedings of the Forty-Fifth Annual ACM Symposium on Theory of Computing
The statistical query (SQ) model was introduced in [6] to handle noise in the well-known PAC model. In this model the learner gains information about the target concept by asking for various statistics about it. Characterizing the number of queries required for learning a given concept class under a fixed distribution was first considered in [3] for weak learning; strong learnability was later characterized in [8]. However, the proofs of these results in [3,10,8] (and, for strong learnability, even the characterization itself) are rather complex; our main goal is to present a simple approach that works for both problems. Additionally, we strengthen the result on strong learnability by showing that a class is learnable with polynomially many queries iff every consistent algorithm uses polynomially many queries, and by showing that proper and improper learning are essentially equivalent. As an example, we apply our results to learning conjunctions under the uniform distribution.
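To make the query model concrete, here is a minimal sketch of an SQ interaction, assuming a simulated oracle that estimates the expectation of a {0,1}-valued query over labeled examples and may answer anywhere within an additive tolerance tau. The function names and the monotone-conjunction learner below are illustrative, not taken from the paper: under the uniform distribution, a variable belongs to a monotone conjunction exactly when Pr[f(x) = 1 and x_i = 0] is zero, so thresholding the oracle's answer recovers the relevant variables.

```python
import random

random.seed(0)

def sq_oracle(chi, target, n, tau, n_samples=20000):
    """Simulated SQ oracle: estimate E[chi(x, f(x))] for x uniform in
    {0,1}^n, then return some value within additive tolerance tau."""
    total = 0
    for _ in range(n_samples):
        x = [random.randint(0, 1) for _ in range(n)]
        total += chi(x, target(x))
    # The oracle is allowed to shift the true expectation by up to tau;
    # we add tau/2 to mimic a (mildly) adversarial answer.
    return total / n_samples + tau / 2

def learn_monotone_conjunction(target, n, tau=0.05):
    """Keep variable i iff Pr[f(x) = 1 and x_i = 0] is near zero,
    which holds exactly when x_i appears in the target conjunction."""
    relevant = []
    for i in range(n):
        chi = lambda x, y, i=i: 1 if (y == 1 and x[i] == 0) else 0
        if sq_oracle(chi, target, n, tau) < 2 * tau:
            relevant.append(i)
    return relevant

# Hypothetical target: the conjunction x0 AND x2 over n = 4 variables.
target = lambda x: 1 if (x[0] == 1 and x[2] == 1) else 0
print(learn_monotone_conjunction(target, 4))  # → [0, 2]
```

Note that the learner never sees individual examples, only tolerance-corrupted statistics; this is exactly what makes SQ algorithms robust to classification noise.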