Receiver Operating Characteristic (ROC) curves are a standard way to display the performance of a set of binary classifiers across all feasible ratios of the costs associated with false positives and false negatives. For linear classifiers, the set of classifiers is typically obtained by training once, holding the estimated slope constant, and then varying the intercept to obtain a parameterized family of classifiers whose performances can be plotted in the ROC plane. We consider the alternative of varying the asymmetry of the cost function used for training. We show that the ROC curve obtained by varying both the intercept and the asymmetry, and hence the slope, always outperforms the ROC curve obtained by varying the intercept alone. In addition, we present a path-following algorithm for the support vector machine (SVM) that can efficiently compute the entire ROC curve, with the same computational complexity as training a single classifier. Finally, we provide a theoretical analysis of the relationship between the asymmetric cost model assumed when training a classifier and the cost model assumed when applying the classifier. In particular, we show that the mismatch between the step function used for testing and its convex upper bounds, usually used for training, leads to a provable and quantifiable difference at extreme asymmetries.
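The intercept-sweep construction described above can be sketched in a few lines of numpy. This is an illustrative assumption, not the paper's path-following algorithm: `roc_by_intercept` is a hypothetical helper that, for a fixed slope, treats every threshold on the scores as a different choice of intercept and records the resulting (false-positive rate, true-positive rate) pair; the Gaussian scores stand in for the outputs of a trained linear classifier.

```python
import numpy as np

def roc_by_intercept(scores, labels):
    """ROC points for a fixed linear classifier obtained by sweeping the
    intercept: predict positive iff score > t, for every threshold t.

    `scores` are the fixed-slope values w.x; `labels` are in {0, 1}.
    (Hypothetical helper for illustration, not the paper's algorithm.)
    """
    order = np.argsort(-scores)                 # sort by descending score
    labels = np.asarray(labels, dtype=float)[order]
    n_pos = labels.sum()                        # total positives
    n_neg = len(labels) - n_pos                 # total negatives
    # After lowering the threshold past the k-th score, the first k
    # examples are predicted positive; cumulative sums give TP and FP.
    tpr = np.concatenate(([0.0], np.cumsum(labels) / n_pos))
    fpr = np.concatenate(([0.0], np.cumsum(1.0 - labels) / n_neg))
    return fpr, tpr

# Synthetic scores from one fixed slope: positives score higher on average.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(1.0, 1.0, 200),    # positive class
                         rng.normal(-1.0, 1.0, 200)])  # negative class
labels = np.concatenate([np.ones(200), np.zeros(200)])
fpr, tpr = roc_by_intercept(scores, labels)
```

The paper's point is that this curve is what a single training run can offer; retraining with different cost asymmetries changes the slope as well, and the envelope of the resulting curves dominates the one traced by the intercept sweep alone.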