ν-support vector machine as conditional value-at-risk minimization

  • Authors:
  • Akiko Takeda; Masashi Sugiyama

  • Affiliations:
  • Keio University, Yokohama, Kanagawa, Japan; Tokyo Institute of Technology, Tokyo, Japan

  • Venue:
  • Proceedings of the 25th international conference on Machine learning
  • Year:
  • 2008

Abstract

The ν-support vector classification (ν-SVC) algorithm was shown to work well and to provide intuitive interpretations, e.g., the parameter ν roughly specifies the fraction of support vectors. Although ν corresponds to a fraction, it cannot take the entire range between 0 and 1 in its original form. This problem was settled by a non-convex extension of ν-SVC, and the extended method was experimentally shown to generalize better than the original ν-SVC. However, the reasons for its good generalization performance and the convergence properties of its optimization algorithm have not been studied yet. In this paper, we provide new theoretical insights into these issues and propose a novel ν-SVC algorithm that has guaranteed generalization performance and convergence properties.
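The two properties of the original ν-SVC mentioned above (ν roughly specifying the fraction of support vectors, and ν not being able to take the entire range between 0 and 1) can be observed with a short sketch. The snippet below is not from the paper; it is a minimal illustration assuming scikit-learn's NuSVC, which implements the standard convex ν-SVC, on a synthetic imbalanced dataset.

```python
# Minimal sketch (assumption: scikit-learn's NuSVC as a stand-in for standard nu-SVC)
# illustrating two claims from the abstract:
#   1. nu roughly specifies the fraction of support vectors.
#   2. nu cannot take the entire range (0, 1): for large nu on imbalanced data,
#      the convex problem becomes infeasible and the solver raises an error.
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
# Imbalanced two-class toy data: 80 positives, 20 negatives.
X = np.vstack([rng.normal(loc=+1.0, size=(80, 2)),
               rng.normal(loc=-1.0, size=(20, 2))])
y = np.array([1] * 80 + [-1] * 20)

for nu in (0.1, 0.3, 0.5, 0.7, 0.9):
    try:
        clf = NuSVC(nu=nu, kernel="linear").fit(X, y)
        frac_sv = len(clf.support_) / len(y)
        print(f"nu={nu:.1f}: fraction of support vectors = {frac_sv:.2f}")
    except ValueError as err:
        # Roughly, nu > 2*min(n+, n-)/n is infeasible for the convex nu-SVC;
        # this admissible-range limitation is what the non-convex extension removes.
        print(f"nu={nu:.1f}: infeasible ({err})")
```

On such data, the reported fraction of support vectors tends to track ν for feasible values, while the largest settings of ν fail, matching the restricted admissible range discussed in the abstract.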