Learning perceptrons (linear threshold functions) from labeled examples is a fundamental problem in machine learning. We consider the setting in which labels are subject to random classification noise: each label is flipped independently with some fixed probability. The problem was known to be PAC learnable via a hypothesis consisting of a polynomial number of linear thresholds (due to A. Blum, A. Frieze, R. Kannan, and S. Vempala, 1996). Whether a hypothesis that is itself a perceptron (a single linear threshold function) can be found in polynomial time remained open. We show that noisy perceptrons are indeed PAC learnable with a hypothesis that is a perceptron.
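The random classification noise model described above can be illustrated with a small sketch. Everything here is illustrative: the averaged-perceptron baseline is a common folk heuristic for damping the effect of noisy updates, not the paper's polynomial-time algorithm, and the dimension, noise rate, and target vector are arbitrary choices.

```python
import random

def noisy_label(w_true, x, eta):
    # Random classification noise: the true label sign(<w_true, x>) is
    # flipped independently with probability eta.
    y = 1 if sum(wi * xi for wi, xi in zip(w_true, x)) >= 0 else -1
    return -y if random.random() < eta else y

def averaged_perceptron(examples, epochs=50):
    # Plain perceptron mistake-driven updates, with the weight vector
    # averaged over the whole run. Averaging damps individual noisy
    # updates; it is NOT the paper's algorithm, only a simple baseline.
    dim = len(examples[0][0])
    w = [0.0] * dim
    w_sum = [0.0] * dim
    for _ in range(epochs):
        for x, y in examples:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
            w_sum = [s + wi for s, wi in zip(w_sum, w)]
    return w_sum

random.seed(0)
w_true = [1.0, -2.0, 0.5]                       # hypothetical target perceptron
points = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(500)]
examples = [(x, noisy_label(w_true, x, eta=0.1)) for x in points]
w = averaged_perceptron(examples)
# Fraction of points on which the learned threshold agrees with the target.
agreement = sum(
    (sum(wi * xi for wi, xi in zip(w, x)) >= 0)
    == (sum(wi * xi for wi, xi in zip(w_true, x)) >= 0)
    for x in points
) / len(points)
print(round(agreement, 2))
```

Note that the hypothesis returned is itself a single linear threshold function, which is exactly the representational constraint the paper shows can be met in polynomial time under this noise model.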