A quantile binary classifier uses the rule: classify x as +1 if P(Y = 1|X = x) ≥ τ, and as -1 otherwise, for a fixed quantile parameter τ ∈ [0, 1]. It has been shown that Support Vector Machines (SVMs) are, in the limit, quantile classifiers with τ = 1/2. In this paper, we show that by using asymmetric misclassification costs, SVMs can be appropriately extended to recover, in the limit, the quantile binary classifier for any τ. We then present a principled algorithm to solve the extended SVM classifier for all values of τ simultaneously. This has two implications: first, one can recover the entire conditional distribution P(Y = 1|X = x) = τ for τ ∈ [
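The quantile rule above can be viewed as the minimizer of a cost-weighted 0-1 risk in which a false negative costs (1 - τ) and a false positive costs τ; this is the same cost asymmetry that, in the limit, makes the extended SVM target the τ-quantile rule. The following is a minimal sketch of that equivalence, assuming access to the true conditional probability p = P(Y = 1|X = x); the function names are illustrative, not from the paper.

```python
def quantile_classify(p, tau):
    """Quantile binary rule: +1 if P(Y=1|X=x) >= tau, else -1."""
    return 1 if p >= tau else -1

def weighted_risk(p, prediction, tau):
    """Expected cost-weighted 0-1 loss at a point with P(Y=1|X=x) = p.

    A false positive costs tau; a false negative costs (1 - tau).
    """
    if prediction == 1:
        # Predicting +1: pay tau when the truth is -1 (probability 1 - p).
        return tau * (1.0 - p)
    # Predicting -1: pay (1 - tau) when the truth is +1 (probability p).
    return (1.0 - tau) * p

# Pointwise, the quantile rule matches the weighted-risk minimizer:
# predict +1 iff tau * (1 - p) <= (1 - tau) * p, i.e. iff p >= tau.
for tau in (0.25, 0.5, 0.75):
    for p in (0.1, 0.3, 0.7, 0.9):
        best = min((1, -1), key=lambda c: weighted_risk(p, c, tau))
        assert quantile_classify(p, tau) == best
```

Sweeping τ over [0, 1] and recording where the predicted label flips recovers the level sets of P(Y = 1|X = x), which is the sense in which solving for all τ yields the entire conditional distribution.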