Baum's Algorithm Learns Intersections of Halfspaces with Respect to Log-Concave Distributions
APPROX/RANDOM '09: Proceedings of the 12th International Workshop (APPROX) and 13th International Workshop (RANDOM) on Approximation, Randomization, and Combinatorial Optimization: Algorithms and Techniques
Bounding the average sensitivity and noise sensitivity of polynomial threshold functions
Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC)
An invariance principle for polytopes
Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC)
A random-sampling-based algorithm for learning intersections of halfspaces
Journal of the ACM (JACM)
Learning and lower bounds for AC^0 with threshold gates
APPROX/RANDOM '10: Proceedings of the 13th International Workshop (APPROX) and 14th International Workshop (RANDOM) on Approximation, Randomization, and Combinatorial Optimization: Algorithms and Techniques
Submodular functions are noise stable
Proceedings of the 23rd Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)
Hardness results for agnostically learning low-degree polynomial threshold functions
Proceedings of the 22nd Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)
An invariance principle for polytopes
Journal of the ACM (JACM)
We study the learnability of sets in R^n under the Gaussian distribution, taking Gaussian surface area as the "complexity measure" of the sets being learned. Let C_S denote the class of all (measurable) sets with surface area at most S. We first show that the class C_S is learnable to any constant accuracy in time n^{O(S^2)}, even in the arbitrary-noise ("agnostic") model. Complementing this, we also show that any learning algorithm for C_S information-theoretically requires 2^{\Omega(S^2)} examples for learning to constant accuracy. Together, these results show that Gaussian surface area essentially characterizes the computational complexity of learning under the Gaussian distribution.

Our approach yields several new learning results, including the following (all bounds are for learning to any constant accuracy):

- The class of all convex sets can be agnostically learned in time 2^{\tilde{O}(\sqrt{n})} (and we prove a 2^{\Omega(\sqrt{n})} lower bound for noise-free learning). This is the first subexponential-time algorithm for learning general convex sets, even in the noise-free (PAC) model.
- Intersections of k halfspaces can be agnostically learned in time n^{O(\log k)} (cf. Vempala's n^{O(k)}-time algorithm for learning in the noise-free model).
- Cones (with apex centered at the origin), and spheres with arbitrary radius and center, can be agnostically learned in time poly(n).
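The abstract does not spell out how the specific bounds follow from the general n^{O(S^2)} result. Below is a minimal sketch in LaTeX, assuming the standard Gaussian surface-area estimates of Ball (for convex sets) and Nazarov (for intersections of halfspaces); these estimates are well-known facts assumed here, not stated in the abstract itself.

% Gaussian surface area of a measurable set K in R^n, where \gamma_n is
% the standard Gaussian measure and K_\delta is the \delta-neighborhood of K:
\Gamma(K) = \liminf_{\delta \to 0^+} \frac{\gamma_n(K_\delta) - \gamma_n(K)}{\delta}

% Plugging known surface-area bounds S = \Gamma(K) into the n^{O(S^2)} algorithm:
%   convex sets:                   \Gamma(K) = O(n^{1/4})        (Ball)
%       => time n^{O(\sqrt{n})} = 2^{\tilde{O}(\sqrt{n})}
%   intersections of k halfspaces: \Gamma(K) = O(\sqrt{\log k})  (Nazarov)
%       => time n^{O(\log k)}
%   spheres (any radius/center):   \Gamma(K) = O(1)
%       => time poly(n)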