A noise-detection based AdaBoost algorithm for mislabeled data

  • Authors:
  • Jingjing Cao; Sam Kwong; Ran Wang

  • Affiliation (all authors):
  • Department of Computer Science, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong

  • Venue:
  • Pattern Recognition
  • Year:
  • 2012

Abstract

Noise sensitivity is a well-known weakness of the AdaBoost algorithm. Previous work has shown that AdaBoost is prone to overfitting on noisy data sets because it consistently assigns high weights to hard-to-learn instances (mislabeled instances or outliers). In this paper, a new boosting approach, named noise-detection based AdaBoost (ND-AdaBoost), is proposed to combine classifiers by emphasizing misclassified noisy instances and correctly classified non-noisy instances during training. Specifically, the algorithm integrates a noise-detection based loss function into AdaBoost to adjust the weight distribution at each iteration. Two evaluation criteria for detecting noisy instances are constructed, one based on the k-nearest-neighbor (k-NN) rule and one on expectation maximization (EM). Further, a regeneration condition is presented and analyzed to control the ensemble training error bound of the proposed algorithm, which provides theoretical support. Finally, experiments on selected binary UCI benchmark data sets demonstrate that the proposed algorithm is more robust than standard AdaBoost and other AdaBoost variants on noisy data sets.
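
The abstract describes the mechanism only at a high level: a k-NN (or EM) criterion flags suspected mislabeled instances, and a noise-detection based loss function modulates the AdaBoost weight update. The snippet below is a minimal sketch of one plausible reading of that idea, assuming a simple k-NN filter (flag an instance when most of its k nearest neighbors disagree with its label) and a weight update of the form w_i <- w_i * exp(-alpha_t * rho_i * y_i * h_t(x_i)), where rho_i = +1 for instances judged clean and rho_i = -1 for flagged ones. The paper's exact loss function, its EM-based criterion, its alpha_t derivation, and the regeneration condition are not reproduced here; the names knn_noise_flags and nd_adaboost_sketch are hypothetical.

```python
# Illustrative sketch only, not the authors' exact formulation.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier


def knn_noise_flags(X, y, k=5):
    """Flag instances whose label disagrees with the majority of their k neighbors."""
    X, y = np.asarray(X), np.asarray(y)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)            # first column is the point itself
    neighbor_labels = y[idx[:, 1:]]      # shape (n, k)
    agreement = (neighbor_labels == y[:, None]).mean(axis=1)
    return agreement < 0.5               # True -> suspected mislabeled


def nd_adaboost_sketch(X, y, n_rounds=50, k=5):
    """Boosting loop whose weight update is modulated by k-NN noise flags.

    y is expected in {-1, +1}. Returns (classifiers, alphas).
    """
    X, y = np.asarray(X), np.asarray(y, dtype=float)
    n = len(y)
    rho = np.where(knn_noise_flags(X, y, k), -1.0, 1.0)  # -1 for suspected noise
    w = np.full(n, 1.0 / n)
    classifiers, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        h = stump.predict(X)
        # Weighted error measured against the noise-adjusted targets rho * y
        # (an assumption of this sketch, not necessarily the paper's definition).
        err = np.clip(np.sum(w * (h != rho * y)) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Clean instances (rho = +1) follow the standard AdaBoost rule;
        # flagged instances (rho = -1) have the sign of the update reversed.
        w *= np.exp(-alpha * rho * y * h)
        w /= w.sum()
        classifiers.append(stump)
        alphas.append(alpha)
    return classifiers, alphas


def predict(classifiers, alphas, X):
    """Weighted-majority vote of the boosted stumps."""
    votes = sum(a * clf.predict(np.asarray(X)) for clf, a in zip(classifiers, alphas))
    return np.sign(votes)
```

With rho_i = +1 everywhere the update collapses to standard AdaBoost, so the noise flags only change how flagged instances are re-weighted from round to round: a flagged instance keeps gaining weight while the ensemble still fits its suspect label and loses weight once that label is contradicted, which pushes the combination toward misclassifying noisy instances and correctly classifying non-noisy ones, in the spirit of the abstract.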