Database-friendly random projections
PODS '01 Proceedings of the twentieth ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems
STOC '02 Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
An Algorithmic Theory of Learning: Robust Concepts and Random Projection
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
Database-friendly random projections: Johnson-Lindenstrauss with binary coins
Journal of Computer and System Sciences - Special issue on PODS 2001
Learning intersections and thresholds of halfspaces
Journal of Computer and System Sciences - Special issue on FOCS 2002
Unconditional lower bounds for learning intersections of halfspaces
Machine Learning
On hardness of learning intersection of two halfspaces
STOC '08 Proceedings of the fortieth annual ACM symposium on Theory of computing
Computational Complexity
Cryptographic hardness for learning intersections of halfspaces
Journal of Computer and System Sciences
Baum's Algorithm Learns Intersections of Halfspaces with Respect to Log-Concave Distributions
APPROX '09 / RANDOM '09 Proceedings of the 12th International Workshop and 13th International Workshop on Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques
A discriminative model for semi-supervised learning
Journal of the ACM (JACM)
Optimal bounds for sign-representing the intersection of two halfspaces by polynomials
Proceedings of the forty-second ACM symposium on Theory of computing
A random-sampling-based algorithm for learning intersections of halfspaces
Journal of the ACM (JACM)
On the hardness of learning intersections of two halfspaces
Journal of Computer and System Sciences
Improved lower bounds for learning intersections of halfspaces
COLT'06 Proceedings of the 19th annual conference on Learning Theory
We present an algorithm for learning the intersection of half-spaces in n dimensions. Over nearly-uniform distributions, it runs in polynomial time for up to O(log n / log log n) half-spaces or, more generally, for any number of half-spaces whose normal vectors lie in an O(log n / log log n) dimensional subspace. Over less restricted "non-concentrated" distributions it runs in polynomial time for a constant number of half-spaces. This generalizes an earlier result of Blum and Kannan. The algorithm is simple and is based on random sampling.
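To make the concept class concrete, here is a minimal sketch (not the paper's algorithm) of labeling points by an intersection of half-spaces: a point is positive iff it satisfies every constraint w·x ≥ b. The half-spaces and the sampling distribution below are hypothetical example values chosen for illustration.

```python
import random

def in_intersection(x, halfspaces):
    """Label x positive iff it lies in every half-space (w, b), i.e. w.x >= b for all."""
    return all(sum(wi * xi for wi, xi in zip(w, x)) >= b for w, b in halfspaces)

# Two example half-spaces in 2 dimensions: x0 >= 0 and x1 >= 0 (the positive quadrant).
halfspaces = [([1.0, 0.0], 0.0), ([0.0, 1.0], 0.0)]

# Draw labeled examples from the uniform distribution on [-1, 1]^2
# (a stand-in for the "nearly-uniform" distributions in the abstract).
random.seed(0)
sample = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(1000)]
labels = [in_intersection(x, halfspaces) for x in sample]

# Roughly a quarter of uniform points land in the positive quadrant.
print(sum(labels) / len(labels))
```

A learner in this model receives such (x, label) pairs and must recover a hypothesis consistent with the unknown half-spaces; the paper's contribution is doing so in polynomial time via random sampling under the stated distributional assumptions.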