Communications of the ACM
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
From on-line to batch learning
COLT '89 Proceedings of the second annual workshop on Computational learning theory
Learnability with respect to fixed distributions
Theoretical Computer Science
Predicting {0, 1}-functions on randomly drawn points
Information and Computation
Approximating hyper-rectangles: learning and pseudorandom sets
Journal of Computer and System Sciences (special issue: Fourteenth ACM SIGACT-SIGMOD-SIGART Symposium on Principles of Database Systems)
On PAC learning using Winnow, Perceptron, and a Perceptron-like algorithm
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics)
On the sample complexity of PAC learning half-spaces against the uniform distribution
IEEE Transactions on Neural Networks
Agnostically Learning Halfspaces
FOCS '05 Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science
Using the doubling dimension to analyze the generalization of learning algorithms
Journal of Computer and System Sciences
The regularized least squares algorithm and the problem of learning halfspaces
Information Processing Letters
Efficient algorithms for general active learning
COLT'06 Proceedings of the 19th annual conference on Learning Theory
Analysis of perceptron-based active learning
COLT'05 Proceedings of the 18th annual conference on Learning Theory
We show that halfspaces in n dimensions can be PAC-learned with respect to the uniform distribution with accuracy ε and confidence δ using O((1/ε)(n + log(1/δ))) examples.
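To make the sample-complexity bound concrete, here is a minimal sketch that evaluates m = O((1/ε)(n + log(1/δ))) for given parameters. The constant factor `C` is an assumption for illustration only; the O(·) notation in the abstract hides the true constant.

```python
import math

def sample_bound(n, eps, delta, C=1.0):
    """Illustrative sample-size estimate for the stated bound
    m = O((1/eps) * (n + log(1/delta))).
    C is an assumed constant; the actual constant is hidden by O(.)."""
    return math.ceil((C / eps) * (n + math.log(1.0 / delta)))

# Example: halfspaces in n = 100 dimensions, accuracy eps = 0.1,
# confidence delta = 0.05
m = sample_bound(100, 0.1, 0.05)
```

Note how the dimension n enters linearly while the confidence parameter δ contributes only logarithmically, so tightening δ is far cheaper than raising the dimension or the accuracy.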