Stochastic Finite Learning of the Pattern Languages

  • Authors:
  • Peter Rossmanith
  • Thomas Zeugmann

  • Affiliations:
  • Institut für Informatik, Technische Universität München, 80290 München, Germany. rossmani@in.tum.de
  • Institut für Theoretische Informatik, Medizinische Universität Lübeck, Wallstr. 40, 23560 Lübeck, Germany. thomas@tcs.mu-luebeck.de

  • Venue:
  • Machine Learning
  • Year:
  • 2001

Abstract

The present paper proposes a new learning model, called stochastic finite learning, and shows the whole class of pattern languages to be learnable within this model.

This main result is achieved by providing a new and improved average-case analysis of the Lange–Wiehagen (New Generation Computing, 8, 361–370) algorithm, which learns the class of all pattern languages in the limit from positive data. The complexity measure chosen is the total learning time, i.e., the overall time taken by the algorithm until convergence. The expectation of the total learning time is carefully analyzed, and exponentially shrinking tail bounds for it are established for a large class of probability distributions. For every pattern π containing k different variables, it is shown that Lange and Wiehagen's algorithm possesses an expected total learning time of O(α̂^k · E[Λ] · log_{1/β}(k)), where α̂ and β are two easily computable parameters arising naturally from the underlying probability distributions, and E[Λ] is the expected example string length.

Finally, assuming a bit of domain knowledge concerning the underlying class of probability distributions, it is shown how to convert learning in the limit into stochastic finite learning.
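To make the analyzed algorithm concrete, the following is an illustrative Python sketch of one reading of the Lange–Wiehagen update step: a strictly shorter example replaces the current hypothesis, a longer example is ignored, and an equal-length example is merged position-wise, with positions that exhibit the same mismatch pair receiving the same variable. The function name, the data representation (constants as one-character strings, variables as integers), and these exact conventions are our own choices for illustration, not taken from the paper.

```python
def lw_update(hyp, w):
    """One Lange–Wiehagen-style update (illustrative sketch).

    hyp: current hypothesis pattern as a list, where constants are
         one-character strings and variables are integers; None before
         the first example.
    w:   the next positive example string.
    Returns the updated hypothesis pattern.
    """
    w = list(w)
    # First example, or a strictly shorter one: take it as the hypothesis.
    if hyp is None or len(w) < len(hyp):
        return w
    # Longer examples carry no new length information; keep the hypothesis.
    if len(w) > len(hyp):
        return hyp
    # Equal length: merge position-wise. Identical symbols are kept;
    # each distinct (hypothesis symbol, example symbol) mismatch pair
    # is mapped to one shared variable.
    var_of = {}
    merged = []
    for a, b in zip(hyp, w):
        if a == b:
            merged.append(a)
        else:
            key = (a, b)
            if key not in var_of:
                var_of[key] = len(var_of)  # next fresh variable index
            merged.append(var_of[key])
    return merged
```

For instance, feeding "a0b0" and then "a1b1" yields the pattern ["a", 0, "b", 0], i.e., a x b x with one repeated variable, while a subsequent shorter example such as "ab" would replace the hypothesis outright.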