Support Vector Machines and the Bayes Rule in Classification. Data Mining and Knowledge Discovery.
Classification with a Reject Option using a Hinge Loss. The Journal of Machine Learning Research.
Pattern Recognition and Neural Networks.
Uncertainty in clustering and classification. SUM'10: Proceedings of the 4th International Conference on Scalable Uncertainty Management.
A family of measures for best top-n class-selective decision rules. Pattern Recognition.
Boosting k-NN for Categorization of Natural Scenes. International Journal of Computer Vision.
Fuzzy machine learning and data mining. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery.
ECG beat classification using a cost sensitive classifier. Computer Methods and Programs in Biomedicine.
A unified view of class-selection with probabilistic classifiers. Pattern Recognition.
The data replication method for the classification with reject option. AI Communications.
In this paper, we investigate binary classification with a reject option, in which the classifier may withhold its decision on an observation at a cost lower than that of a misclassification. Because the natural loss function for this problem is non-convex, direct empirical risk minimization is computationally infeasible; we therefore propose minimizing convex risks based on surrogate convex loss functions. We give a necessary and sufficient condition for infinite-sample consistency, that is, for both risks to share the same minimizer. Moreover, we show that, under appropriate conditions, the excess risk can be bounded in terms of the excess surrogate risk, and that these bounds can be tightened under a generalized margin condition. We illustrate the impact of these results on several commonly used surrogate loss functions.
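To make the setting concrete, the following is a minimal plug-in sketch of the standard reject-option setup from this literature: the 0–d–1 loss (0 for a correct decision, d for a rejection, 1 for an error, with 0 < d < 1/2) and the corresponding Bayes rule, which predicts a class only when the estimated posterior η(x) = P(Y = 1 | x) is far enough from 1/2. The function names and the encoding of "reject" as 0 are illustrative, not from the paper.

```python
import numpy as np

def reject_option_predict(eta, d):
    """Plug-in Bayes rule for binary classification with a reject option
    under the 0-d-1 loss: predict +1 when eta >= 1 - d, predict -1 when
    eta <= d, and reject (encoded as 0) when eta lies in (d, 1 - d).
    Here eta is an estimate of P(Y = 1 | x) and 0 < d < 1/2 is the
    rejection cost."""
    eta = np.asarray(eta, dtype=float)
    pred = np.zeros_like(eta, dtype=int)  # 0 encodes "reject"
    pred[eta >= 1 - d] = 1
    pred[eta <= d] = -1
    return pred

def empirical_risk(pred, y, d):
    """Average 0-d-1 loss: cost d per rejection, 1 per misclassification."""
    pred, y = np.asarray(pred), np.asarray(y)
    rejected = pred == 0
    errors = (~rejected) & (pred != y)
    return d * rejected.mean() + errors.mean()
```

For example, with rejection cost d = 0.2, posteriors [0.95, 0.5, 0.1] yield decisions [+1, reject, -1]; note that as d approaches 1/2 the reject region (d, 1 - d) shrinks and the rule reduces to the ordinary Bayes classifier.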