An Efficient Extension to Mixture Techniques for Prediction and Decision Trees

  • Authors:
  • Fernando C. Pereira; Yoram Singer

  • Affiliations:
  • AT&T Labs, 180 Park Avenue, Florham Park, NJ 07932. pereira@research.att.com; AT&T Labs, 180 Park Avenue, Florham Park, NJ 07932. singer@research.att.com

  • Venue:
  • Machine Learning
  • Year:
  • 1999

Abstract

We present an efficient method for maintaining mixtures of prunings of a prediction or decision tree that extends previous methods for “node-based” prunings (Buntine, 1990; Willems, Shtarkov, & Tjalkens, 1995; Helmbold & Schapire, 1997; Singer, 1997) to the larger class of edge-based prunings. The method includes an online weight-allocation algorithm that can be used for prediction, compression, and classification. Although the set of edge-based prunings of a given tree is much larger than the set of node-based prunings, our algorithm has space and time complexity similar to that of previous mixture algorithms for trees. Using the general online framework of Freund and Schapire (1997), we prove that our algorithm correctly maintains the mixture weights for edge-based prunings under any bounded loss function. We also give a similar result for the logarithmic loss function, with a corresponding weight-allocation algorithm. Finally, we describe experiments comparing node-based and edge-based mixture models for estimating the probability of the next word in English text, which show the advantages of edge-based models.
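The weight-allocation scheme the abstract refers to builds on the multiplicative-weights framework of Freund and Schapire (1997). As a rough illustration only (not the paper's algorithm, which maintains these weights implicitly over the exponentially large set of edge-based prunings), the following sketch shows the basic Hedge-style update over a small explicit set of hypothetical "experts" standing in for prunings; the names, losses, and `beta` parameter here are illustrative assumptions.

```python
# Illustrative Hedge-style mixture (Freund & Schapire, 1997):
# each "expert" stands in for one pruning of the tree.

def hedge_update(weights, losses, beta=0.5):
    """Multiply each expert's weight by beta**loss, then renormalize.

    Experts with smaller loss on the round retain more weight.
    Requires 0 < beta < 1 and bounded losses in [0, 1].
    """
    assert 0 < beta < 1
    new = [w * beta ** l for w, l in zip(weights, losses)]
    z = sum(new)
    return [w / z for w in new]

def mixture_prediction(weights, predictions):
    """Weighted average of the experts' predictions."""
    return sum(w * p for w, p in zip(weights, predictions))

# Three hypothetical experts predicting the probability of the next symbol.
weights = [1 / 3, 1 / 3, 1 / 3]
predictions = [0.9, 0.5, 0.1]
print(mixture_prediction(weights, predictions))  # 0.5 under uniform weights

# Bounded losses for the round (e.g., absolute error); the best expert
# (smallest loss) gains relative weight after the update.
weights = hedge_update(weights, losses=[0.1, 0.5, 0.9])
```

The paper's contribution is that this mixture over all edge-based prunings can be maintained without enumerating them, in space and time comparable to node-based mixture algorithms.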