Communications of the ACM
Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation.
Cryptographic limitations on learning Boolean formulae and finite automata. Journal of the ACM (JACM).
Learning linear threshold functions in the presence of classification noise. COLT '94: Proceedings of the Seventh Annual Conference on Computational Learning Theory.
Weakly learning DNF and characterizing statistical query learning using Fourier analysis. STOC '94: Proceedings of the Twenty-Sixth Annual ACM Symposium on Theory of Computing.
Journal of the ACM (JACM)
Toward Efficient Agnostic Learning. Machine Learning (special issue on computational learning theory, COLT '92).
On the Fourier spectrum of monotone functions. Journal of the ACM (JACM).
General bounds on statistical query learning and PAC learning with noise via hypothesis boosting. Information and Computation.
Specification and simulation of statistical query algorithms for efficiency and noise tolerance. Journal of Computer and System Sciences (special issue on the Eighth Annual Workshop on Computational Learning Theory, July 5–8, 1995).
Efficient noise-tolerant learning from statistical queries. Journal of the ACM (JACM).
Boosting and Hard-Core Set Construction. Machine Learning.
FOCS '02: Proceedings of the 43rd Symposium on Foundations of Computer Science
Noise-tolerant learning, the parity problem, and the statistical query model. Journal of the ACM (JACM).
Hard-core distributions for somewhat hard problems. FOCS '95: Proceedings of the 36th Annual Symposium on Foundations of Computer Science.
On Learning Monotone Boolean Functions. FOCS '98: Proceedings of the 39th Annual Symposium on Foundations of Computer Science.
On using extended statistical queries to avoid membership queries. The Journal of Machine Learning Research.
A simple polynomial-time rescaling algorithm for solving linear programs. STOC '04: Proceedings of the Thirty-Sixth Annual ACM Symposium on Theory of Computing.
Practical privacy: the SuLQ framework. Proceedings of the Twenty-Fourth ACM SIGMOD-SIGACT-SIGART Symposium on Principles of Database Systems.
New lower bounds for statistical query learning. Journal of Computer and System Sciences (special issue on COLT 2002).
CCC '07: Proceedings of the Twenty-Second Annual IEEE Conference on Computational Complexity
A general dimension for query learning. Journal of Computer and System Sciences.
Unconditional lower bounds for learning intersections of halfspaces. Machine Learning.
Agnostically Learning Halfspaces. SIAM Journal on Computing.
Evolvability from learning algorithms. STOC '08: Proceedings of the Fortieth Annual ACM Symposium on Theory of Computing.
Journal of the ACM (JACM)
FOCS '08: Proceedings of the 2008 49th Annual IEEE Symposium on Foundations of Computer Science
The uniform hardcore lemma via approximate Bregman projections. SODA '09: Proceedings of the Twentieth Annual ACM-SIAM Symposium on Discrete Algorithms.
On Agnostic Learning of Parities, Monomials, and Halfspaces. SIAM Journal on Computing.
A Complete Characterization of Statistical Query Learning with Applications to Evolvability. FOCS '09: Proceedings of the 2009 50th Annual IEEE Symposium on Foundations of Computer Science.
A characterization of strong learnability in the statistical query model. STACS '07: Proceedings of the 24th Annual Conference on Theoretical Aspects of Computer Science.
Characterizing statistical query learning: simplified notions and proofs. ALT '09: Proceedings of the 20th International Conference on Algorithmic Learning Theory.
Statistical algorithms and a lower bound for detecting planted cliques. Proceedings of the Forty-Fifth Annual ACM Symposium on Theory of Computing.
The statistical query (SQ) learning model of Kearns is a natural restriction of the PAC learning model in which a learning algorithm may obtain estimates of statistical properties of the examples but cannot see the examples themselves (Kearns, 1998 [29]). We describe a new and simple characterization of the query complexity of learning in the SQ learning model. Unlike the previously known bounds on SQ learning (Blum et al., 1994; Bshouty and Feldman, 2002; Yang, 2005; Balcázar et al., 2007; Simon, 2007 [9,11,42,3,37]), our characterization preserves both the accuracy and the efficiency of learning. The preservation of accuracy implies that our characterization gives the first characterization of SQ learning in the agnostic learning framework of Haussler (1992) [23] and Kearns, Schapire and Sellie (1994) [31]. The preservation of efficiency is achieved using a new boosting technique and allows us to derive a new approach to the design of evolution algorithms in Valiant's model of evolvability (Valiant, 2009 [40]). We use this approach to demonstrate the existence of a large class of monotone evolution algorithms based on square-loss performance estimation. These results differ significantly from the few previously known evolution algorithms and give evidence that evolvability in Valiant's model is a more versatile phenomenon than there had previously been reason to suspect.
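As a rough illustration of the model the abstract describes (this is not code from the paper, and all names below are our own), Kearns's STAT oracle can be simulated from random labeled examples: given a bounded query function and a tolerance, averaging over enough samples returns an estimate within the tolerance with high probability.

```python
import math


def stat_oracle(query, target, sample, tau, delta=0.05):
    """Simulate a STAT(f, D) oracle: estimate E_{x ~ D}[query(x, target(x))]
    to within tolerance tau, with failure probability at most delta.

    query  -- function mapping (example, label) to a value in [0, 1]
    target -- the unknown labeling function f
    sample -- callable drawing one example x from the distribution D
    """
    # Hoeffding's inequality: this many samples suffice for tolerance tau.
    n = math.ceil(math.log(2.0 / delta) / (2.0 * tau * tau))
    total = 0.0
    for _ in range(n):
        x = sample()
        total += query(x, target(x))
    return total / n
```

An SQ learner interacts with the data only through such calls, which is why (as several of the works listed above exploit) the returned estimates can be perturbed by noise, or computed privately, without breaking the learning guarantee.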