A complete and tight average-case analysis of learning monomials

  • Authors:
  • Rüdiger Reischuk; Thomas Zeugmann

  • Affiliations:
  • Institut für Theoretische Informatik, Medizinische Universität zu Lübeck, Lübeck, Germany; Department of Informatics, Kyushu University, Kasuga, Japan

  • Venue:
  • STACS '99: Proceedings of the 16th Annual Symposium on Theoretical Aspects of Computer Science
  • Year:
  • 1999

Abstract

We advocate analyzing the average-case complexity of learning problems and introduce an appropriate framework for this purpose. Within this framework we consider the problem of learning monomials, and the special case of learning monotone monomials, in the limit and for on-line prediction in two variants: from positive data only, and from both positive and negative examples. The well-known Wholist algorithm is completely analyzed, in particular its average-case behavior with respect to the class of binomial distributions. We consider several complexity measures: the number of mind changes, the number of prediction errors, and the total learning time. Tight bounds are obtained, implying that worst-case bounds are too pessimistic; on average, learning can be achieved exponentially faster. Furthermore, we study a new learning model, stochastic finite learning, in which, in contrast to PAC learning, some information about the underlying distribution is given and the goal is to find a correct (not merely approximately correct) hypothesis. We develop techniques to obtain good bounds for stochastic finite learning from a precise average-case analysis of strategies for learning in the limit, and we illustrate our approach for the case of learning monomials.
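The Wholist algorithm analyzed in the paper is simple to state: start with the hypothesis containing all 2n literals and, on each positive example, delete every literal the example contradicts; a mind change occurs whenever the hypothesis actually shrinks. The following Python sketch illustrates this update rule together with a small simulation under a product (binomial-style) distribution. It is an illustrative reconstruction, not code from the paper; the function names, the choice of a monotone target, and the parameter p are assumptions made for the demonstration.

```python
import random

def wholist_update(hypothesis, example):
    # Keep only the literals the positive example satisfies:
    # (i, True) stands for x_i, (i, False) for its negation.
    return {(i, sign) for (i, sign) in hypothesis if example[i] == sign}

def learn_monomial(n, positive_examples):
    # Wholist: start with all 2n literals and shrink on positive data.
    hypothesis = {(i, s) for i in range(n) for s in (True, False)}
    mind_changes = 0
    for ex in positive_examples:
        updated = wholist_update(hypothesis, ex)
        if updated != hypothesis:  # hypothesis shrank: one mind change
            mind_changes += 1
            hypothesis = updated
    return hypothesis, mind_changes

def sample_positive(n, relevant, p, rng):
    # A positive example of the monotone monomial over `relevant`:
    # relevant bits are forced to 1, all others are i.i.d. Bernoulli(p).
    return [i in relevant or rng.random() < p for i in range(n)]

rng = random.Random(0)
n, relevant, p = 20, {0, 1, 2}, 0.5   # illustrative parameters
examples = [sample_positive(n, relevant, p, rng) for _ in range(200)]
hypothesis, changes = learn_monomial(n, examples)
print(sorted(hypothesis))  # only the literals x_0, x_1, x_2 survive
print(changes)             # typically far fewer updates than the worst case
```

The intuition behind the exponentially better average-case behavior can be read off this sketch: an irrelevant variable keeps a literal only until some example has shown it set both ways, which under a product distribution happens quickly for all variables at once, so the hypothesis stabilizes after few mind changes.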