Some weak learning results

  • Authors:
  • David P. Helmbold; Manfred K. Warmuth

  • Venue:
  • COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory
  • Year:
  • 1992

Abstract

An algorithm is a weak learner if, with some small probability, it outputs a hypothesis with error slightly below 50%. This paper presents sufficient conditions for weak learning. Our main result requires a “consistency oracle” for the concept class F which decides, for a given set of examples, whether there is a concept in F consistent with the examples. We show that such an oracle can be used to construct a computationally efficient weak learning algorithm for F if F is learnable at all. We consider consistency oracles which are allowed to give wrong answers and discuss how the number of incorrect answers affects the oracle's use in computationally efficient weak learning algorithms. We also define “weak Occam algorithms” which, when given a set of m examples, select a consistent hypothesis from some class of 2^(m-(1/p(m))) possible hypotheses, for some polynomial p. We show that these weak Occam algorithms are also weak learners. In contrast, we show that an Occam-style algorithm which selects a consistent hypothesis from a class of 2^(m+1)-2 hypotheses is not a weak learner.
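
In symbols, a standard formalization of these conditions reads as follows; the exact constants, the polynomial p, and the notation are assumptions in the spirit of the abstract, not quantities quoted from the paper:

% Weak learning: with probability at least 1/p(m), the output hypothesis h
% beats random guessing by an inverse-polynomial margin under distribution D.
\Pr\bigl[\mathrm{err}_D(h) \le \tfrac{1}{2} - \tfrac{1}{p(m)}\bigr] \;\ge\; \frac{1}{p(m)}

% Weak Occam: on m examples, the algorithm selects a consistent hypothesis
% from a class H_m whose size is just below 2^m (sufficient for weak learning).
|\mathcal{H}_m| \le 2^{\,m-(1/p(m))}

% Contrast: selecting from a class of size 2^(m+1)-2 is not sufficient.
|\mathcal{H}_m| = 2^{\,m+1}-2

The contrast between the last two bounds is what makes the result sharp: shrinking the exponent m by an inverse-polynomial term suffices for weak learning, while a class of roughly 2^(m+1) hypotheses does not.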
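
The abstract only asserts that a consistency oracle suffices for weak learning; the actual construction is in the paper. As a toy illustration of how such an oracle can drive prediction, here is a minimal Python sketch. The is_consistent interface and the threshold-concept oracle are hypothetical choices for illustration, and the oracle is assumed to always answer correctly (the paper also analyzes oracles that may err):

from typing import Any, Callable, List, Tuple

# A labeled example: (instance, binary label).
Example = Tuple[Any, int]

def predict_with_consistency_oracle(
    sample: List[Example],
    x: Any,
    is_consistent: Callable[[List[Example]], bool],
) -> int:
    """Predict a label for x using only a consistency oracle for the class F.

    is_consistent(S) is assumed to report whether some concept in F agrees
    with every labeled example in S (a hypothetical interface).
    """
    # If some concept in F can label x positive while matching the sample,
    # predict 1; otherwise fall back to 0.  When both extensions are
    # consistent, this tie-breaking is arbitrary.
    if is_consistent(sample + [(x, 1)]):
        return 1
    return 0

def threshold_oracle(examples: List[Example]) -> bool:
    """Exact consistency oracle for threshold concepts {x : x >= t} on the line.

    A consistent threshold exists iff every positive instance lies strictly
    to the right of every negative instance.
    """
    positives = [x for x, y in examples if y == 1]
    negatives = [x for x, y in examples if y == 0]
    return not positives or not negatives or min(positives) > max(negatives)

sample = [(0.2, 0), (0.7, 1)]
print(predict_with_consistency_oracle(sample, 0.9, threshold_oracle))  # prints 1
print(predict_with_consistency_oracle(sample, 0.1, threshold_oracle))  # prints 0

For the threshold class the oracle is exact, so this predictor labels any point consistently with some concept in the class whenever one exists; the paper's contribution is turning such oracle access into provable weak-learning guarantees.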