A general lower bound on the number of examples needed for learning. Information and Computation.
Learning in Neural Networks: Theoretical Foundations.
An elementary proof of a theorem of Johnson and Lindenstrauss. Random Structures & Algorithms.
The Journal of Machine Learning Research
A model of inductive bias learning. Journal of Artificial Intelligence Research.
On the sample complexity of PAC learning half-spaces against the uniform distribution. IEEE Transactions on Neural Networks.
We give a lower bound on the error of any unitarily invariant algorithm learning half-spaces under the uniform or related distributions on the unit sphere. The bound is uniform in the choice of the target half-space and holds with a deviation probability that decays exponentially in the sample size. The proof technique is related to a proof of the Johnson-Lindenstrauss Lemma. We argue that, unlike previous lower bounds, our result is well suited to evaluating the benefits of multi-task or transfer learning, or of other settings in which an expense in the acquisition of domain knowledge has to be justified.
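To illustrate the Johnson-Lindenstrauss phenomenon the abstract alludes to, the following is a minimal NumPy sketch of the standard random-projection construction; the dimensions, point count, and seed are arbitrary choices, not taken from the paper. Pairwise distances of random points survive projection to a much lower dimension up to a small multiplicative distortion.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 300  # number of points, original dimension, projected dimension
X = rng.standard_normal((n, d))

# Random Gaussian projection scaled by 1/sqrt(k), as in common proofs
# of the Johnson-Lindenstrauss Lemma.
P = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ P


def pairwise_sq_dists(Z):
    """Matrix of squared Euclidean distances between the rows of Z."""
    G = Z @ Z.T
    sq = np.diag(G)
    return sq[:, None] + sq[None, :] - 2.0 * G


# Ratio of projected to original squared distances, off-diagonal pairs only.
mask = ~np.eye(n, dtype=bool)
ratios = pairwise_sq_dists(Y)[mask] / pairwise_sq_dists(X)[mask]

# The ratios concentrate near 1: each one is a chi-square-type average
# over k independent coordinates, so its deviation is O(1/sqrt(k)).
print(ratios.min(), ratios.max())
```

Increasing `k` tightens the concentration of the distance ratios around 1, at the cost of a higher projected dimension; this trade-off is exactly what the exponential tail bounds in JL-style arguments quantify.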