On the Boosting Algorithm for Multiclass Functions Based on Information-Theoretic Criterion for Approximation

  • Authors:
  • Eiji Takimoto; Akira Maruoka

  • Venue:
  • DS '98: Proceedings of the First International Conference on Discovery Science
  • Year:
  • 1998

Abstract

We consider a boosting technique that can be applied directly to the classification problem for multiclass functions. Although many boosting algorithms have been proposed so far, all of them are essentially developed for binary classification problems, and to handle a multiclass classification problem they must first reduce it somehow to binary ones. To avoid such reductions, we introduce the notion of a pseudo-entropy function G that gives an information-theoretic criterion, called the conditional G-entropy, for measuring the loss of a hypothesis. The conditional G-entropy turns out to be useful for defining the weakness of hypotheses that approximate, in some way, a multiclass function, so that we can formulate the boosting problem without any reduction. We show that the top-down decision tree learning algorithm that uses G as its splitting criterion is an efficient boosting algorithm with respect to the conditional G-entropy: the algorithm aims to minimize the conditional G-entropy rather than the classification error. In the binary case, our algorithm coincides with the error-based boosting algorithm proposed by Kearns and Mansour, and our analysis yields a simpler proof of their results.
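The abstract describes the algorithm only at a high level, so the following is a minimal sketch of the kind of top-down decision tree induction it refers to, driven by a pseudo-entropy splitting criterion. It makes several assumptions not fixed by the abstract: G is taken to be the Shannon entropy over the class frequencies at a leaf (one admissible pseudo-entropy; the paper's G is more general), axis-aligned threshold tests stand in for the weak hypothesis class, and the conditional G-entropy of the sample given the tree is computed as the leaf-size-weighted average of G over the leaves. All function names are hypothetical.

```python
import math
from collections import Counter

def g_entropy(labels):
    # Pseudo-entropy G of a label multiset; here G is Shannon entropy
    # over class frequencies (an assumption; the paper's G is any
    # admissible pseudo-entropy).
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_g_entropy(partition, total):
    # Conditional G-entropy of the sample given the tree's leaf
    # partition: the leaf-size-weighted average of G at each leaf.
    return sum(len(leaf) / total * g_entropy([y for _, y in leaf])
               for leaf in partition)

def best_split(leaf):
    # Search axis-aligned threshold tests (a stand-in weak hypothesis
    # class) for the split minimizing the weighted G-entropy of the
    # two resulting leaves; return None if no split improves on G(leaf).
    best = None
    best_score = g_entropy([y for _, y in leaf])
    n_features = len(leaf[0][0])
    for j in range(n_features):
        for thr in sorted({x[j] for x, _ in leaf}):
            left = [(x, y) for x, y in leaf if x[j] <= thr]
            right = [(x, y) for x, y in leaf if x[j] > thr]
            if not left or not right:
                continue
            score = conditional_g_entropy([left, right], len(leaf))
            if score < best_score:
                best_score, best = score, (left, right)
    return best

def grow_tree(data, n_splits):
    # Top-down induction: greedily split whichever leaf yields the
    # largest drop in the overall conditional G-entropy.
    partition = [data]
    for _ in range(n_splits):
        total = len(data)
        base = conditional_g_entropy(partition, total)
        best_gain, best_leaf, best_parts = 0.0, None, None
        for i, leaf in enumerate(partition):
            split = best_split(leaf)
            if split is None:
                continue
            left, right = split
            new = partition[:i] + partition[i + 1:] + [left, right]
            gain = base - conditional_g_entropy(new, total)
            if gain > best_gain:
                best_gain, best_leaf, best_parts = gain, i, (left, right)
        if best_leaf is None:
            break
        partition = (partition[:best_leaf] + partition[best_leaf + 1:]
                     + list(best_parts))
    return partition

# Toy three-class usage: two splits suffice to make every leaf pure,
# driving the conditional G-entropy to zero.
data = [((0.1,), 'a'), ((0.2,), 'a'), ((0.5,), 'b'),
        ((0.6,), 'b'), ((0.9,), 'c'), ((1.0,), 'c')]
leaves = grow_tree(data, n_splits=2)
print([sorted({y for _, y in leaf}) for leaf in leaves])
```

In the binary case, replacing the Shannon entropy above with G(q) = 2 * sqrt(q * (1 - q)) recovers the splitting criterion of Kearns and Mansour that the abstract refers to; the sketch otherwise leaves the choice of G open, as the paper does.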