Generalization performance of ν-support vector classifier based on conditional value-at-risk minimization

  • Authors:
  • Akiko Takeda

  • Affiliations:
  • Department of Administration Engineering, Keio University, 3-14-1 Hiyoshi, Kohoku, Yokohama, Kanagawa 223-8522, Japan

  • Venue:
  • Neurocomputing
  • Year:
  • 2009

Abstract

We extend the conditional geometric score (CGS) classifier of Gotoh and Takeda for binary linear classification to a nonlinear one, which we call the β-support vector classifier (SVC), and investigate the equivalence between the β-SVC and the (extended) ν-SVC. The CGS classifier has recently been found to be equivalent to the extended ν-SVC of Pérez-Cruz et al. and, especially in the convex case, to the ν-SVC of Schölkopf et al. The CGS problem is to minimize a risk measure known as the conditional value-at-risk (β-CVaR). In this paper, we discuss theoretical aspects of the β-SVC, mainly its generalization performance. The generalization error bound we derive includes the β-CVaR or a closely related quantity, implying that minimizing the β-CVaR leads to a small generalization error bound for the β-SVC. The CVaR-minimization viewpoint is thus useful for ensuring the validity of not only the β-SVC but also the (extended) ν-SVC.
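To make the β-CVaR quantity concrete: for a loss distribution, the β-CVaR is the expected loss in the worst (1 − β) tail, i.e., the conditional expectation of losses at or above the β-quantile (the β-VaR). The sketch below is a minimal illustration of this idea, not the paper's own formulation: it trains a kernel ν-SVC with scikit-learn's `NuSVC` (a stand-in for the classifiers discussed above), takes the negative functional margin as the loss, and evaluates its empirical β-CVaR under the assumed correspondence ν = 1 − β suggested by the equivalence results. The toy data and parameter choices are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import NuSVC

def empirical_cvar(losses, beta):
    """Empirical beta-CVaR: mean of the losses in the upper (1 - beta) tail.

    The beta-VaR is estimated by the beta-quantile of the sample losses;
    beta-CVaR then averages the losses at or above that quantile.
    """
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, beta)        # beta-VaR (beta-quantile)
    return losses[losses >= var].mean()    # conditional tail expectation

# Toy data (assumption): two Gaussian blobs with labels -1 / +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),
               rng.normal(+1.0, 1.0, (100, 2))])
y = np.repeat([-1, 1], 100)

beta = 0.9
# Assumed correspondence from the equivalence discussion: nu = 1 - beta.
clf = NuSVC(nu=1 - beta, kernel="rbf").fit(X, y)

# Negative functional margin as the loss whose tail the beta-CVaR measures.
margin_losses = -y * clf.decision_function(X)
print(f"empirical {beta}-CVaR of the margin loss: "
      f"{empirical_cvar(margin_losses, beta):.3f}")
```

Under this reading, a small (ideally negative) empirical β-CVaR means even the worst (1 − β) fraction of training margins is large, which is the quantity the generalization bound above ties to test performance.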