Learning noisy perceptrons by a perceptron in polynomial time

  • Authors:
  • E. Cohen

  • Affiliations:
  • -

  • Venue:
  • FOCS '97 Proceedings of the 38th Annual Symposium on Foundations of Computer Science
  • Year:
  • 1997

Abstract

Learning perceptrons (linear threshold functions) from labeled examples is an important problem in machine learning. We consider the setting in which labels are subject to random classification noise. The class was known to be PAC learnable via a hypothesis consisting of a polynomial number of linear threshold functions (due to A. Blum, A. Frieze, R. Kannan, and S. Vempala, 1996). Whether a hypothesis that is itself a perceptron (a single linear threshold function) can be found in polynomial time was an open question. We show that noisy perceptrons are indeed PAC learnable with a hypothesis that is a perceptron.
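
The sketch below illustrates the learning setting the abstract describes, not the paper's algorithm: a target linear threshold function labels random examples, each label is flipped independently with probability eta (random classification noise), and a classic perceptron update is run as a simple baseline. The dimension, sample size, and noise rate are illustrative assumptions; Cohen's polynomial-time procedure is more involved, but its output has the same single-threshold form as the hypothesis `w` here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: target perceptron w_star over R^d, with each label
# flipped independently with probability eta (random classification noise).
d, n, eta = 5, 2000, 0.1
w_star = rng.normal(size=d)

X = rng.normal(size=(n, d))
y = np.sign(X @ w_star)
y[y == 0] = 1                        # break ties toward +1
flip = rng.random(n) < eta           # noise: flip each label w.p. eta
y[flip] *= -1

# Baseline learner: classic perceptron updates for a fixed number of passes.
# This is NOT the algorithm from the paper; it only shows the hypothesis
# class (a single linear threshold function) in action on noisy data.
w = np.zeros(d)
for _ in range(20):
    for i in rng.permutation(n):
        if y[i] * (X[i] @ w) <= 0:   # misclassified (or on the boundary)
            w += y[i] * X[i]

# Compare the learned threshold to the noiseless target on the sample.
agreement = np.mean(np.sign(X @ w) == np.sign(X @ w_star))
print(f"agreement with target on sample: {agreement:.3f}")
```

On typical runs this baseline recovers a hypothesis that agrees with the target on most of the sample, but unlike the paper's result it carries no polynomial-time PAC guarantee under classification noise.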