Improved learning of AC0 functions
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Constant depth circuits, Fourier transform, and learnability
Journal of the ACM (JACM)
An introduction to computational learning theory
The complexity of approximating entropy
STOC '02 Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
Testing that distributions are close
FOCS '00 Proceedings of the 41st Annual Symposium on Foundations of Computer Science
Testing Random Variables for Independence and Identity
FOCS '01 Proceedings of the 42nd IEEE Symposium on Foundations of Computer Science
Sublinear algorithms for testing monotone and unimodal distributions
STOC '04 Proceedings of the thirty-sixth annual ACM symposium on Theory of computing
Testing k-wise and almost k-wise independence
Proceedings of the thirty-ninth annual ACM symposium on Theory of computing
Algorithmic and Analysis Techniques in Property Testing
Foundations and Trends® in Theoretical Computer Science
Measuring independence of datasets
Proceedings of the forty-second ACM symposium on Theory of computing
Testing monotone continuous distributions on high-dimensional real cubes
SODA '10 Proceedings of the twenty-first annual ACM-SIAM symposium on Discrete Algorithms
A monotone distribution P over a (partially) ordered domain satisfies P(y) ≥ P(x) whenever y ≥ x in the order. We study several natural problems of testing properties of monotone distributions over the n-dimensional Boolean cube, given access to random draws from the distribution being tested. We give a poly(n)-time algorithm for testing whether a monotone distribution is identical to or ε-far (in the L1 norm) from the uniform distribution. A key ingredient of the algorithm is a generalization of a known isoperimetric inequality for the Boolean cube. We also introduce a method for proving lower bounds on testing monotone distributions over the n-dimensional Boolean cube, based on a new decomposition technique for monotone distributions. We use this method to show that our uniformity-testing algorithm is optimal up to polylog(n) factors, and to give exponential lower bounds for several other problems: testing whether a monotone distribution is identical to or ε-far from a fixed known monotone product distribution, and approximating the entropy of an unknown monotone distribution.
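To make the definitions concrete, the following is a small illustrative sketch (not from the paper): it checks the monotonicity condition P(y) ≥ P(x) for y ≥ x by brute force over the Boolean cube {0,1}^n, and computes the L1 distance to the uniform distribution. The bias-0.75 product distribution used as an example is a hypothetical choice of a monotone distribution, not one taken from the paper.

```python
from itertools import product

def is_monotone(p, n):
    # Brute-force check of the monotonicity condition: P(y) >= P(x)
    # whenever y dominates x coordinatewise in {0,1}^n.
    points = list(product([0, 1], repeat=n))
    for x in points:
        for y in points:
            dominates = all(yi >= xi for xi, yi in zip(x, y))
            if dominates and p[y] < p[x]:
                return False
    return True

def l1_to_uniform(p, n):
    # L1 distance between p and the uniform distribution on {0,1}^n.
    u = 1.0 / (2 ** n)
    return sum(abs(p[x] - u) for x in product([0, 1], repeat=n))

# Hypothetical example: the product distribution where each coordinate
# is 1 with probability 0.75 -- a monotone distribution, since flipping
# a 0 to a 1 multiplies the probability by 0.75 / 0.25 > 1.
n = 3
p = {x: (0.75 ** sum(x)) * (0.25 ** (n - sum(x)))
     for x in product([0, 1], repeat=n)}
```

Note that this exhaustive check takes time exponential in n; the point of the paper's testing algorithms is precisely to avoid enumerating the 2^n points and instead work from random draws.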