We introduce a framework for class noise in which most of the known class noise models for the PAC setting can be formulated. Within this framework, we study properties of noise models that enable learning of concept classes of finite VC-dimension with the Empirical Risk Minimization (ERM) strategy. We then exhibit simple noise models for which classical ERM fails. Aiming at a more general-purpose algorithm for learning under noise, we generalize ERM to a more powerful strategy. Finally, we study general characteristics of noise models that enable learning of concept classes of finite VC-dimension with this new strategy.
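As a point of reference for the classical ERM strategy discussed above, the following is a minimal sketch, not taken from the paper: ERM simply returns the hypothesis with the smallest empirical error on the labeled sample. The toy concept class (thresholds on [0, 1]) and the random classification noise rate are illustrative assumptions.

```python
import random

def erm(hypotheses, sample):
    """Empirical Risk Minimization: return the hypothesis in the class
    that disagrees with the fewest labels in the sample."""
    def empirical_error(h):
        return sum(1 for x, y in sample if h(x) != y)
    return min(hypotheses, key=empirical_error)

# Toy concept class of finite VC-dimension: threshold functions on [0, 1].
hypotheses = [lambda x, t=t / 10: x >= t for t in range(11)]

# Labels come from the target threshold 0.5, with each label flipped
# independently with probability 0.1 (random classification noise).
random.seed(0)
target = lambda x: x >= 0.5
sample = []
for _ in range(200):
    x = random.random()
    y = target(x)
    if random.random() < 0.1:
        y = not y
    sample.append((x, y))

best = erm(hypotheses, sample)
```

Since the target concept belongs to the class, the hypothesis ERM returns has empirical error no larger than the target's error on the noisy sample; the paper's point is that under some noise models this empirical minimizer can still generalize poorly.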