Algorithms in combinatorial geometry
Information Processing Letters
Learning decision trees from random examples
Information and Computation
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
On learning a union of half spaces
Journal of Complexity
The perceptron algorithm is fast for nonmalicious distributions
Neural Computation
Training a 3-node neural network is NP-complete
Advances in neural information processing systems 1
Efficient noise-tolerant learning from statistical queries
STOC '93 Proceedings of the twenty-fifth annual ACM symposium on Theory of computing
Statistical queries and faulty PAC oracles
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
Almost optimal set covers in finite VC-dimension (preliminary version)
SCG '94 Proceedings of the tenth annual symposium on Computational geometry
Composite geometric concepts and polynomial predictability
Information and Computation
Efficient NC algorithms for set cover with applications to learning and geometry
Proceedings of the 30th IEEE symposium on Foundations of computer science
Four types of noise in data for PAC learning
Information Processing Letters
Specification and simulation of statistical query algorithms for efficiency and noise tolerance
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
Learning from a consistently ignorant teacher
Journal of Computer and System Sciences
PAC learning intersections of halfspaces with membership queries (extended abstract)
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
Learning of depth two neural networks with constant fan-in at the hidden nodes (extended abstract)
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
A composition theorem for learning algorithms with applications to geometric concept classes
STOC '97 Proceedings of the twenty-ninth annual ACM symposium on Theory of computing
A new composition theorem for learning algorithms
STOC '98 Proceedings of the thirtieth annual ACM symposium on Theory of computing
Noise-tolerant parallel learning of geometric concepts
Information and Computation
Exact learning of discretized geometric concepts
SIAM Journal on Computing
Learning fixed-dimension linear thresholds from fragmented data
Information and Computation
When Can Two Unsupervised Learners Achieve PAC Separation?
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Bias-variance tradeoffs in program analysis
Proceedings of the 41st ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages
We present an efficient algorithm for PAC-learning a very general class of geometric concepts over R^d for fixed d. More specifically, let T be any set of s halfspaces. Let x = (x_1, …, x_d) be an arbitrary point in R^d. With each t ∈ T we associate a Boolean indicator function I_t(x) which is 1 if and only if x is in the halfspace t. The concept class C^d_s that we study consists of all concepts formed by any Boolean function over I_{t_1}, …, I_{t_s} for t_i ∈ T. This class is much more general than any geometric concept class known to be PAC-learnable. Our results extend easily to learning efficiently any Boolean combination of a polynomial number of concepts selected from any concept class C over R^d, given that the VC-dimension of C depends only on d and that there is a polynomial-time algorithm to determine whether some concept from C is consistent with a given set of labeled examples. We also present a statistical-query version of our algorithm that can tolerate random classification noise. Finally, we present a generalization of the standard ε-net result of Haussler and Welzl [1987] and apply it to give an alternative noise-tolerant algorithm for d = 2 based on geometric subdivisions.
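To make the concept class concrete, here is a minimal sketch (not the paper's learning algorithm) of how a concept in C^d_s is evaluated: each halfspace t contributes an indicator bit I_t(x), and the concept is an arbitrary Boolean function of those bits. The halfspace representation (w, b) and the sample XOR concept below are illustrative choices, not taken from the paper.

```python
# Sketch: evaluating a concept from C^d_s, i.e. a Boolean function
# applied to the indicator bits of s halfspaces in R^d.
# Representation assumed for the demo: a halfspace t = (w, b) contains
# x iff w . x + b >= 0.

def indicator(halfspace, x):
    """I_t(x) = 1 iff x lies in halfspace t = (w, b)."""
    w, b = halfspace
    return int(sum(wi * xi for wi, xi in zip(w, x)) + b >= 0)

def concept(halfspaces, boolean_fn, x):
    """A concept in C^d_s: boolean_fn applied to the s indicator bits."""
    bits = tuple(indicator(t, x) for t in halfspaces)
    return boolean_fn(bits)

# Two halfspaces in R^2 (d = 2, s = 2): x1 >= 0 and x2 >= 0.
T = [((1, 0), 0), ((0, 1), 0)]

# An arbitrary Boolean combination: the symmetric difference (XOR) of
# the two halfspaces -- a non-convex concept no single halfspace expresses.
xor_concept = lambda bits: bits[0] ^ bits[1]

print(concept(T, xor_concept, (1, -1)))  # in the first halfspace only -> 1
print(concept(T, xor_concept, (1, 1)))   # in both halfspaces -> 0
```

The point of the generality claim in the abstract is that boolean_fn is unrestricted: any of the 2^(2^s) Boolean functions over the s indicators yields a learnable concept, as long as s is polynomial.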