Learning the Kernel Matrix with Semidefinite Programming. The Journal of Machine Learning Research.
A tutorial on ν-support vector machines. Applied Stochastic Models in Business and Industry.
Training ν-Support Vector Classifiers: Theory and Algorithms. Neural Computation.
ν-support vector machine as conditional value-at-risk minimization. Proceedings of the 25th International Conference on Machine Learning.
A modified algorithm for nonconvex support vector classification. Proceedings of the 26th IASTED International Conference on Artificial Intelligence and Applications (AIA '08).
We extend the conditional geometric score (CGS) classifier of Gotoh and Takeda for binary linear classification to a nonlinear classifier, which we call the β-support vector classifier (β-SVC), and investigate the equivalence between the β-SVC and the (extended) ν-SVC. The CGS classifier has recently been shown to be equivalent to the extended ν-SVC of Pérez-Cruz et al. and, in the convex case in particular, to the ν-SVC of Schölkopf et al. The CGS problem minimizes a risk measure known as the conditional value-at-risk at level β (β-CVaR). In this paper, we discuss theoretical aspects of the β-SVC, mainly its generalization performance. The generalization error bound we derive includes the β-CVaR or a closely related quantity, which implies that minimizing the β-CVaR yields a small generalization error bound for the β-SVC. The CVaR-minimization viewpoint is thus useful for establishing the validity not only of the β-SVC but also of the (extended) ν-SVC.
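For context, here is a minimal LaTeX sketch of the two objects the abstract relates. The Rockafellar–Uryasev form of CVaR and the primal of Schölkopf et al.'s ν-SVC are standard background, not formulas taken from this page, and the level–parameter correspondence β = 1 − ν is the one reported by Takeda and Sugiyama for the ν-SVC as CVaR minimization (listed above); Φ denotes the (kernel) feature map that makes the classifier nonlinear.

\[
\mathrm{CVaR}_{\beta}(L) \;=\; \min_{\alpha \in \mathbb{R}}
\left\{ \alpha + \frac{1}{1-\beta}\,\mathbb{E}\big[(L-\alpha)_{+}\big] \right\},
\qquad (z)_{+} = \max\{z,\,0\},
\]

\[
\min_{w,\,b,\,\rho,\,\xi}\;\; \frac{1}{2}\|w\|^{2} \;-\; \nu\rho \;+\; \frac{1}{m}\sum_{i=1}^{m}\xi_{i}
\quad\text{s.t.}\quad
y_{i}\big(\langle w,\Phi(x_{i})\rangle + b\big) \;\ge\; \rho - \xi_{i},
\qquad \xi_{i}\ge 0,\;\; \rho \ge 0.
\]

Under this correspondence, β-CVaR averages the worst (1 − β) fraction of negative margins, so minimizing it at level β matches the ν-SVC with ν = 1 − β; this is the sense in which a small minimum β-CVaR translates into the small generalization error bound mentioned in the abstract.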