Coarse-to-fine multiclass learning and classification for time-critical domains

  • Authors:
  • Teo Susnjak, Andre Barczak, Napoleon Reyes, Ken Hawick

  • Affiliation (all authors):
  • Massey University Albany, Private Bag 102904, North Shore 0745, New Zealand

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2013

Abstract

This paper presents a coarse-to-fine learning algorithm for multiclass problems. The algorithm is applied to ensemble-based learning, using boosting to construct cascades of classifiers. The goal is to address the training and detection runtime complexities found in an increasing number of classification domains. This research applies a separate-and-conquer strategy with respect to class labels in order to realize efficiency in both the training and detection phases under limited computational resources, without compromising accuracy. The paper demonstrates how popular, non-cascaded algorithms such as AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC can be converted into robust cascaded classifiers. Additionally, a new multiclass weak learner is proposed that is custom-designed for cascaded training. Experiments were conducted on 18 publicly available datasets and showed that the cascaded algorithms achieved considerable speed-ups over the original AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC in both training and detection runtimes. The cascaded classifiers did not exhibit significant compromises in their generalization ability and, in fact, produced evidence of improved accuracies on datasets with biased class distributions.
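
The abstract describes the approach only at a high level, so the following is a minimal, illustrative Python sketch of the general coarse-to-fine, separate-and-conquer idea: one boosted stage per class label, with the class resolved at each stage removed before the next (finer) stage is trained, and an early exit at detection time. The class name CoarseToFineCascade, the stage_order parameter, and the use of scikit-learn's generic AdaBoostClassifier are assumptions for illustration only; this is not the paper's AdaBoost.M2/OC/ECC-based cascade construction or its custom multiclass weak learner.

    # Illustrative sketch only; stage ordering and stage classifiers are assumptions,
    # not the algorithms proposed in the paper.
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    class CoarseToFineCascade:
        """Separate-and-conquer cascade over class labels (toy version)."""

        def __init__(self, stage_order, n_estimators=50):
            self.stage_order = stage_order        # class labels, coarse to fine
            self.n_estimators = n_estimators
            self.stages = []

        def fit(self, X, y):
            X_rem, y_rem = X, y
            for label in self.stage_order[:-1]:
                # Each stage is a boosted binary classifier: this label vs. the
                # classes still remaining at this depth of the cascade.
                clf = AdaBoostClassifier(n_estimators=self.n_estimators)
                clf.fit(X_rem, (y_rem == label).astype(int))
                self.stages.append((label, clf))
                # "Conquer": drop the separated class before training the next stage,
                # so later stages face smaller, harder sub-problems.
                keep = y_rem != label
                X_rem, y_rem = X_rem[keep], y_rem[keep]
            # Whatever survives every stage is assigned the final (finest) label.
            self.final_label = self.stage_order[-1]
            return self

        def predict(self, X):
            out = np.full(len(X), self.final_label, dtype=object)
            undecided = np.ones(len(X), dtype=bool)
            for label, clf in self.stages:
                if not undecided.any():
                    break                          # early exit gives the detection speed-up
                hit = clf.predict(X[undecided]) == 1
                idx = np.where(undecided)[0][hit]
                out[idx] = label
                undecided[idx] = False
            return out

    # Usage sketch on a toy multiclass dataset
    from sklearn.datasets import load_iris
    X, y = load_iris(return_X_y=True)
    cascade = CoarseToFineCascade(stage_order=[0, 1, 2]).fit(X, y)
    preds = cascade.predict(X)

In this toy version most samples are decided by the early, coarse stages, so the later stages only see the residual, harder samples; this mirrors the efficiency argument in the abstract, but the actual stage construction, weak learners and stopping criteria in the paper differ.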