PAC-Learning with general class noise models
KI'12 Proceedings of the 35th Annual German Conference on Advances in Artificial Intelligence
We study the learnability of concept classes under three classification noise models in the probably approximately correct (PAC) framework. After introducing the Class-Conditional Classification Noise (CCCN) model, in which the label-flip probability may depend on the true class of an example, we investigate the learnability of concept classes in this setting and show that every concept class learnable under the well-known uniform classification noise (CN) model is also CCCN-learnable. Since uniform noise is the special case of class-conditional noise with equal per-class flip rates, the converse inclusion is immediate, which gives CN = CCCN. We then use this result to prove that the set of concept classes that are CN-learnable coincides with the set of concept classes learnable under the Constant Partition Classification Noise (CPCN) model; in other words, CN = CPCN.
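For concreteness, a minimal sketch of the three noise oracles, in our own notation rather than the paper's exact definitions: fix a target concept $c : X \to \{0,1\}$ and a distribution $D$ over the instance space $X$; each oracle call draws $x \sim D$ and returns a pair $(x, \ell)$ whose label $\ell$ is corrupted as follows.

- CN (uniform): $\Pr[\ell \neq c(x)] = \eta$ for a single fixed rate $\eta < 1/2$.
- CCCN (class-conditional): $\Pr[\ell \neq c(x)] = \eta_{c(x)}$, with one rate per true class, $\eta_0, \eta_1 < 1/2$.
- CPCN (constant-partition): for a fixed finite partition of the labeled example space into parts $S_1, \dots, S_k$, $\Pr[\ell \neq c(x) \mid (x, c(x)) \in S_i] = \eta_i$ with each $\eta_i < 1/2$.

Under this reading, CN is the CCCN case $\eta_0 = \eta_1$, and CCCN is the CPCN case where the partition is by true label, which is what makes the trivial inclusions immediate.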