We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error-correcting output codes. The reduction satisfies the property that ε regret for binary classification implies l2-regret of at most 2ε for cost estimation. This has several implications. First, any regret-minimizing online algorithm for 0/1 loss is, via the reduction, a regret-minimizing online cost-sensitive algorithm; in particular, online learning can be made to work for arbitrary (i.e., totally unstructured) loss functions. Second, the output of the reduction can be thresholded so that ε regret for binary classification implies at most 4√(εZ) regret for cost-sensitive classification, where Z is the expected sum of costs. Third, for multiclass problems, ε binary regret translates into l2-regret of at most 4ε in the estimation of class probabilities; for classification, this implies at most 4√ε multiclass regret.
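The core mechanics of an error-correcting-output-codes reduction, which the paper modifies to handle costs, can be sketched as follows. This is a minimal illustration, not the paper's construction: the fixed code matrix, the 1-nearest-neighbour stand-in for the binary learner, and the Hamming-distance decoding are all assumptions made for the sketch.

```python
import numpy as np

# Fixed, pairwise-distinct codewords: one row per class (illustrative;
# the paper's reduction modifies how codes are built and decoded).
CODE = np.array([[0, 0, 0],
                 [0, 1, 1],
                 [1, 0, 1],
                 [1, 1, 0]])

def fit_binary(X, bits):
    # Stand-in binary learner: 1-nearest-neighbour over the training set.
    # Any binary classifier could be plugged in here.
    Xtr, btr = X.copy(), bits.copy()
    def predict(Xq):
        d = ((Xq[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
        return btr[d.argmin(axis=1)]
    return predict

def train(X, y, code):
    # One binary problem per code bit: the binary label for problem j
    # is bit j of the codeword assigned to the multiclass label.
    return [fit_binary(X, code[y, j]) for j in range(code.shape[1])]

def predict_multiclass(X, learners, code):
    # Query every binary learner, then decode each example to the
    # codeword at minimum Hamming distance.
    bits = np.column_stack([h(X) for h in learners])
    dists = np.abs(bits[:, None, :] - code[None, :, :]).sum(-1)
    return dists.argmin(axis=1)

X = np.eye(4)                      # four trivially separable examples
y = np.array([0, 1, 2, 3])
learners = train(X, y, CODE)
print(predict_multiclass(X, learners, CODE))   # → [0 1 2 3]
```

Because the codewords are at Hamming distance ≥ 2 from one another, a single erroneous binary prediction still decodes near the correct class; this error-correcting slack is what lets regret bounds for the binary subproblems transfer to the derived multiclass (or, in the paper's modified form, cost-sensitive) predictor.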