Error limiting reductions between classification tasks

  • Authors:
  • Alina Beygelzimer;Varsha Dani;Tom Hayes;John Langford;Bianca Zadrozny

  • Affiliations:
  • IBM T. J. Watson Research Center, Hawthorne, NY;University of Chicago, Chicago, IL;University of California, Berkeley, CA;TTI-Chicago, Chicago, IL;IBM T. J. Watson Research Center, Yorktown Heights, NY

  • Venue:
  • ICML '05: Proceedings of the 22nd International Conference on Machine Learning
  • Year:
  • 2005


Abstract

We introduce a reduction-based model for analyzing supervised learning tasks. We use this model to devise a new reduction from multi-class cost-sensitive classification to binary classification with the following guarantee: if the learned binary classifier has error rate at most ε, then the cost-sensitive classifier has cost at most 2ε times the expected sum of costs of all possible labels. Since cost-sensitive classification can embed any bounded-loss, finite-choice supervised learning task, this result shows that any such task can be solved using a binary classification oracle. Finally, we present experimental results showing that our new reduction outperforms existing algorithms for multi-class cost-sensitive learning.
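To make the reduction idea concrete, here is a minimal sketch of one common pairwise scheme for reducing cost-sensitive multi-class classification to importance-weighted binary classification. This is an illustrative simplification, not necessarily the exact construction analyzed in the paper: each pair of labels yields a binary example whose importance weight is the cost difference, and prediction decodes the binary oracle's answers by a round-robin vote. The function names and the toy oracle below are hypothetical.

```python
from itertools import combinations

def pairwise_binary_examples(x, costs):
    """Turn one cost-sensitive example (features x, per-label costs)
    into importance-weighted binary examples, one per label pair (i, j).
    Binary label 1 means "prefer i over j"; the weight is the cost
    difference, so pairs with equal cost contribute nothing."""
    examples = []
    for i, j in combinations(range(len(costs)), 2):
        weight = abs(costs[i] - costs[j])
        if weight > 0:
            label = 1 if costs[i] < costs[j] else 0
            examples.append(((x, i, j), label, weight))
    return examples

def predict(x, n_labels, binary_classifier):
    """Decode: run a round-robin tournament over label pairs; each
    pairwise prediction casts one vote, and the label with the most
    votes wins."""
    votes = [0] * n_labels
    for i, j in combinations(range(n_labels), 2):
        winner = i if binary_classifier(x, i, j) == 1 else j
        votes[winner] += 1
    return max(range(n_labels), key=lambda k: votes[k])
```

Note how the importance weights tie the binary error to the cost-sensitive loss: a binary mistake on a pair with a small cost difference is cheap, while mistakes on high-stakes pairs carry proportionally more weight, which is the mechanism behind error-limiting guarantees of this form.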