Designing nonlinear classifiers through minimizing VC dimension bound

  • Authors:
  • Jianhua Xu

  • Affiliations:
  • Department of Computer Science, Nanjing Normal University, Nanjing, China

  • Venue:
  • ISNN'05 Proceedings of the Second International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2005

Abstract

The VC dimension bound for the set of separating hyperplanes is evaluated by the ratio of the squared radius of the smallest enclosing sphere to the squared margin. Once a kernel and its parameters are chosen, the radius is fixed, so the hard-margin SVM minimizes this ratio by minimizing the squared 2-norm of the weight vector alone. In this paper, a bound on the squared radius in the feature space is derived that depends on the scaling factor of the RBF kernel and a bound on the squared radius in the input space. The squared 2-norm of the weight vector is expressed as a quadratic form. Together, these yield a simple VC dimension bound with the RBF kernel for classification. Minimizing this bound gives two constrained nonlinear programming problems, one for the linearly separable case and one for the nonlinearly separable case. By solving them, we can design nonlinear classifiers with the RBF kernel and determine the scaling factor of the RBF kernel simultaneously.
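The two ingredients of the ratio in the abstract can be sketched numerically. This is a minimal illustration, not the paper's optimization: for an RBF kernel every mapped point has unit norm, so both a feature-space radius bound and the quadratic form for the squared weight norm are computable from the Gram matrix alone. The coefficients `alpha`, the data, and the parameterization `K(x, z) = exp(-||x - z||^2 / (2 sigma^2))` are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma):
    """Gram matrix of the RBF kernel exp(-||x - z||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def radius_sq_upper_bound(K):
    """Upper bound on the squared radius R^2 of the smallest enclosing
    sphere in feature space: the kernel centroid is a feasible center,
    so the max squared distance from a mapped point to it bounds R^2.
    Computed entirely via the kernel trick."""
    return float(np.max(np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()))

def weight_norm_sq(K, alpha, y):
    """||w||^2 for w = sum_i alpha_i y_i phi(x_i), written as the
    quadratic form (alpha * y)^T K (alpha * y)."""
    v = alpha * y
    return float(v @ K @ v)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
alpha = np.full(20, 0.1)          # placeholder coefficients, not an SVM solution

K = rbf_kernel(X, sigma=1.0)
R2 = radius_sq_upper_bound(K)     # shrinks/grows with the scaling factor sigma
w2 = weight_norm_sq(K, alpha, y)
vc_bound_proxy = R2 * w2          # the radius^2 / margin^2 style quantity
```

Because `sigma` enters both `R2` and (through `K`) the quadratic form, treating it as a free variable in the product is what couples classifier design to the choice of the RBF scaling factor, which is the idea the paper's programming problems formalize.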