An asymptotic statistical theory of polynomial kernel methods
Systems and Computers in Japan
The properties of learning machines with polynomial kernel classifiers, such as support vector machines and kernel perceptrons, are examined. We first derive the effective number of examples, which is related to the generalization error. Next, we analyze the average prediction errors of several algorithms and show that these errors do not depend on the apparent dimension of the feature space. This means that the so-called overfitting phenomenon does not appear in kernel methods with polynomial kernels. © 2004 Wiley Periodicals, Inc. Syst Comp Jpn, 35(7): 41–48, 2004; Published online in Wiley InterScience. DOI 10.1002/scj.10629
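To make the abstract's contrast concrete, the following is a minimal illustrative sketch (not the paper's method): a standard inhomogeneous polynomial kernel k(x, y) = (c + x·y)^d, evaluated implicitly without ever constructing the feature map, together with the "apparent" dimension of the induced feature space, C(n + d, d), which grows combinatorially in the input dimension n and degree d. The function names `poly_kernel` and `apparent_dim` are hypothetical, chosen here for illustration.

```python
from math import comb

import numpy as np


def poly_kernel(x, y, degree=3, c=1.0):
    # Inhomogeneous polynomial kernel k(x, y) = (c + x.y)^degree.
    # The inner product in the high-dimensional feature space is
    # computed implicitly; no explicit feature map is built.
    return (c + np.dot(x, y)) ** degree


def apparent_dim(n_inputs, degree):
    # Dimension of the feature space induced by the degree-d
    # inhomogeneous polynomial kernel on R^n: C(n + d, d).
    return comb(n_inputs + degree, degree)


x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
print(poly_kernel(x, y, degree=3))  # -> -0.125
print(apparent_dim(100, 3))         # -> 176851, despite only 100 inputs
```

The point the abstract makes is that although `apparent_dim(100, 3)` is large, the average prediction error of polynomial kernel classifiers is governed by the effective number of examples rather than by this apparent dimension.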