We introduce a reduction-based model for analyzing supervised learning tasks. Using this model, we devise a new reduction from multi-class cost-sensitive classification to binary classification with the following guarantee: if the learned binary classifier has error rate at most ε, then the cost-sensitive classifier has cost at most 2ε times the expected sum of costs of all possible labels. Since cost-sensitive classification can embed any bounded-loss, finite-choice supervised learning task, this result shows that any such task can be solved using a binary classification oracle. Finally, we present experimental results showing that our new reduction outperforms existing algorithms for multi-class cost-sensitive learning.
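The abstract does not spell out the reduction itself. As a rough illustration of the general idea, the sketch below reduces multi-class cost-sensitive classification to weighted binary problems, one per label, where the binary weight is the regret (cost above the cheapest label). The regret-based weighting, the one-versus-all decomposition, and the decision-stump learner are all illustrative assumptions for this sketch, not the paper's construction or guarantee.

```python
def train_stump(data):
    """Fit a 1-D decision stump minimizing weighted binary error.

    data: list of (x, y, w) with scalar feature x, label y in {+1, -1},
    and nonnegative importance weight w.
    """
    xs = sorted({x for x, _, _ in data})
    thresholds = [xs[0] - 1.0] + [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    best = None
    for t in thresholds:
        for sign in (+1, -1):
            err = sum(w for x, y, w in data
                      if sign * (1 if x > t else -1) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign * (1 if x > t else -1)


def reduce_and_train(examples, num_labels):
    """Toy cost-sensitive-to-binary reduction (illustrative, not the paper's).

    examples: list of (x, costs) where costs[k] is the cost of predicting
    label k. For each label k we build a weighted binary problem:
      y = +1 if k is the cheapest label, else -1;
      w = the regret of label k (or the saving, for the cheapest label).
    """
    classifiers = []
    for k in range(num_labels):
        binary = []
        for x, costs in examples:
            mn, mx = min(costs), max(costs)
            if costs[k] == mn:
                y, w = +1, mx - mn      # saving from choosing the cheapest label
            else:
                y, w = -1, costs[k] - mn  # regret of choosing label k
            if w > 0:
                binary.append((x, y, w))
        classifiers.append(train_stump(binary))

    def predict(x):
        # Return the first label whose binary classifier claims "cheapest";
        # fall back to label 0 if none does.
        for k, clf in enumerate(classifiers):
            if clf(x) == +1:
                return k
        return 0

    return predict
```

On toy data where label 0 is cheap for small x and label 1 is cheap for large x, the learned predictor recovers the cost-minimizing label on each side of the split.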