We show that the class FBV of [0, 1]-valued functions with total variation at most 1 can be agnostically learned with respect to the absolute loss in polynomial time from O((1/ε²) log(1/δ)) examples, matching a known lower bound to within a constant factor. We establish a bound of O(1/m) on the expected error of a polynomial-time algorithm for learning FBV in the prediction model, also matching a known lower bound to within a constant factor. Applying a known algorithm transformation to our prediction algorithm, we obtain a polynomial-time PAC learning algorithm for FBV with a sample complexity bound of O((1/ε) log(1/δ)); this also matches a known lower bound to within a constant factor.
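To make the setting concrete, the following is a minimal sketch, not the paper's algorithm: it checks membership in FBV (total variation at most 1 over sampled points) and learns a bounded-variation target with a simple nearest-neighbor predictor, measuring the average absolute loss on fresh examples. All function names here are illustrative choices, not from the source.

```python
import bisect
import random

def total_variation(values):
    """Total variation of a function's values at sorted sample points."""
    return sum(abs(b - a) for a, b in zip(values, values[1:]))

def nearest_neighbor_predictor(xs, ys):
    """Predict with the label of the closest sampled point; xs must be sorted."""
    def predict(x):
        i = bisect.bisect_left(xs, x)
        if i == 0:
            return ys[0]
        if i == len(xs):
            return ys[-1]
        # choose whichever of the two neighboring samples is closer to x
        return ys[i] if xs[i] - x < x - xs[i - 1] else ys[i - 1]
    return predict

# An FBV target: a step function on [0, 1] with total variation exactly 1.
f = lambda x: 0.0 if x < 0.5 else 1.0

random.seed(0)
xs = sorted(random.random() for _ in range(1000))
ys = [f(x) for x in xs]
assert total_variation(ys) <= 1.0  # target lies in FBV

h = nearest_neighbor_predictor(xs, ys)

# Average absolute loss on fresh points drawn from the same distribution.
test = [random.random() for _ in range(1000)]
err = sum(abs(h(x) - f(x)) for x in test) / len(test)
```

With 1000 samples the empirical absolute loss `err` is small, since the predictor can only disagree with the target near the jump at 0.5; the O(1/m) and O((1/ε²) log(1/δ)) rates in the abstract are theoretical guarantees that this sketch does not establish.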