Surrogate maximization/minimization algorithms for AdaBoost and the logistic regression model

  • Authors:
  • Zhihua Zhang; James T. Kwok; Dit-Yan Yeung

  • Affiliations:
  • Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong (all authors)

  • Venue:
  • ICML '04 Proceedings of the twenty-first international conference on Machine learning
  • Year:
  • 2004

Abstract

Surrogate maximization (or minimization) (SM) algorithms are a family of algorithms that can be regarded as a generalization of expectation-maximization (EM) algorithms. There are three major approaches to the construction of surrogate functions, all relying on the convexity of some function. In this paper, we address the boosting problem by deriving SM algorithms for the corresponding optimization problems. Specifically, for AdaBoost, we derive an SM algorithm that can be shown to be identical to the algorithm proposed by Collins et al. (2002) based on Bregman distances. More importantly, for LogitBoost (or logistic boosting), we use several methods to construct different surrogate functions, which result in different SM algorithms. By combining multiple methods, we are able to derive an SM algorithm that is also the same as an algorithm derived by Collins et al. (2002). Our approach based on SM algorithms is much simpler, and convergence results follow naturally.
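To illustrate the surrogate idea the abstract refers to, the sketch below shows one standard way a convexity-based surrogate yields a simple iterative algorithm for logistic regression: the classical quadratic upper bound on the logistic loss (its curvature is at most 1/4) gives a majorizer whose minimizer has a closed form. This is a minimal, hypothetical example of surrogate minimization, not the parallel-update boosting algorithm derived in the paper; the function names and synthetic data are assumptions made for illustration.

```python
import numpy as np

def logistic_loss(w, X, y):
    """Average logistic loss; labels y are in {-1, +1}."""
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

def sm_logistic_regression(X, y, n_iter=100):
    """Surrogate-minimization (MM) iterations for logistic regression.

    Uses the classical quadratic upper bound: the second derivative of
    log(1 + exp(-z)) is at most 1/4, so
        L(w) <= L(w_t) + g_t^T (w - w_t)
                + (1 / (8n)) (w - w_t)^T X^T X (w - w_t),
    and minimizing this surrogate in closed form gives the update below.
    """
    n, d = X.shape
    w = np.zeros(d)
    # Curvature matrix of the surrogate; fixed across iterations.
    H = X.T @ X / (4.0 * n)
    H_inv = np.linalg.pinv(H)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(y * (X @ w)))   # sigma(-y_i x_i^T w)
        grad = -(X.T @ (y * p)) / n             # gradient of the average loss
        w = w - H_inv @ grad                    # exact minimizer of the surrogate
    return w

# Example usage on synthetic data (hypothetical):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true + 0.1 * rng.normal(size=200))
w_hat = sm_logistic_regression(X, y)
print(logistic_loss(w_hat, X, y))
```

Because the surrogate upper-bounds the loss and is tight at the current iterate, each update can only decrease the objective, which is the same mechanism by which convergence results follow naturally for the SM algorithms discussed in the paper.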