In this paper, the problem of binary classification is studied under one or two performance constraints. When the constraints cannot all be satisfied, the initial problem has no solution, and an alternative problem obtained by introducing a rejection option is solved instead. Within the framework of statistical hypothesis testing, the optimal solution to such problems is shown to be based on the likelihood ratio, with one or two thresholds depending on whether a rejection option is needed. These problems are then addressed in the setting where the classes are defined only by labelled samples. To illustrate the cases with and without a rejection option, the Neyman-Pearson problem and the problem of minimizing the reject probability subject to a constraint on the error probability are studied. Solutions based on SVMs and on a kernel-based classifier are compared experimentally and discussed.
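The structure of the optimal rule described above (likelihood ratio compared to one threshold, or to two thresholds when a rejection option is introduced) can be sketched numerically. The following is a minimal illustration under assumed densities, not the paper's experiments: two known 1-D Gaussian classes are used so the likelihood ratio is available in closed form, and thresholds are set empirically from samples.

```python
import numpy as np

# Illustrative assumption (not from the paper): class 0 ~ N(0,1), class 1 ~ N(2,1),
# so the log-likelihood ratio log p1(x)/p0(x) has a simple closed form.
rng = np.random.default_rng(0)
n = 100_000
x0 = rng.normal(0.0, 1.0, n)  # samples from class 0
x1 = rng.normal(2.0, 1.0, n)  # samples from class 1

def llr(x, mu0=0.0, mu1=2.0, sigma=1.0):
    """Log-likelihood ratio log p1(x)/p0(x) for equal-variance Gaussians."""
    return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)

# One threshold (Neyman-Pearson): bound the class-0 error probability by alpha
# by thresholding the likelihood ratio at the empirical (1 - alpha) quantile.
alpha = 0.05
t = np.quantile(llr(x0), 1.0 - alpha)
fpr = np.mean(llr(x0) > t)    # class-0 error, approximately alpha by construction
power = np.mean(llr(x1) > t)  # detection probability on class 1

# Two thresholds (rejection option): bound both error probabilities and
# abstain whenever the likelihood ratio falls between t_lo and t_hi.
t_hi = np.quantile(llr(x0), 1.0 - alpha)  # decide class 1 only above t_hi
t_lo = np.quantile(llr(x1), alpha)        # decide class 0 only below t_lo
reject0 = np.mean((llr(x0) >= t_lo) & (llr(x0) <= t_hi))  # reject rate on class 0
print(f"t={t:.3f} fpr={fpr:.4f} power={power:.4f} reject0={reject0:.4f}")
```

When the two error constraints can be met by a single threshold (`t_lo > t_hi`), the reject region is empty and the rule reduces to the one-threshold case, mirroring the dichotomy described in the abstract.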