Increasing the Robustness of Boosting Algorithms within the Linear-programming Framework

  • Authors:
  • Yijun Sun, Sinisa Todorovic, Jian Li

  • Affiliations:
  • Interdisciplinary Center for Biotechnology Research, University of Florida, Gainesville, FL 32610-3622, USA
  • Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
  • Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32610, USA

  • Venue:
  • Journal of VLSI Signal Processing Systems
  • Year:
  • 2007

Abstract

AdaBoost has been successfully used in many signal classification systems. However, it has been observed that on highly noisy data AdaBoost easily leads to overfitting, which seriously constrains its applicability. In this paper, we address this problem by proposing a new regularized boosting algorithm, LPnorm2-AdaBoost (LPNA). The algorithm arises from a close connection between AdaBoost and linear programming. During training, the skewness of the data distribution is controlled to prevent outliers from spoiling the decision boundary. To this end, a smooth convex penalty function (the ℓ2 norm) is introduced into the objective function of a minimax problem. A stabilized column-generation technique then transforms the optimization into a simple linear programming problem. The effectiveness of the proposed algorithm is demonstrated through experiments on a wide variety of datasets.
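
To make the construction concrete, below is a minimal sketch of the penalized minimax problem the abstract describes, written in the standard linear-programming view of boosting. The notation (example weights d over N training points, margins y_i h_j(x_i) of T weak hypotheses, simplex Δ_N, penalty weight C > 0) is an illustrative assumption, not necessarily the paper's exact formulation:

    % Hedged sketch: AdaBoost's minimax game over the example-weight
    % distribution d, with an l2 penalty that discourages skewed,
    % outlier-dominated distributions.
    \[
      \min_{d \in \Delta_N} \; \max_{1 \le j \le T}
        \sum_{i=1}^{N} d_i \, y_i h_j(x_i)
        \;+\; C \sum_{i=1}^{N} d_i^{2},
      \qquad
      \Delta_N = \Bigl\{ d \in \mathbb{R}^{N} : d_i \ge 0,\ \textstyle\sum_{i=1}^{N} d_i = 1 \Bigr\}.
    \]

With C = 0 this reduces to the hard-margin minimax game, whose optimal d tends to concentrate all weight on a few hard (often noisy) examples; the smooth convex ℓ2 term caps that concentration, which is the skewness control the abstract refers to.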