The nature of statistical learning theory
IEEE Transactions on Neural Networks
Advantages of Unbiased Support Vector Classifiers for Data Mining Applications
Journal of VLSI Signal Processing Systems
This Letter discusses the application of gradient-based methods to the training of a single-layer perceptron, subject to the constraint that the saturation degree of the sigmoid activation function (measured as its maximum slope in the sample space) is fixed to a given value. From a theoretical standpoint, we show that if the training set is not linearly separable, minimizing an Lp error norm approximates the minimum-error classifier, provided that the perceptron is highly saturated. Moreover, if the data are linearly separable, the perceptron approximates the maximum-margin classifier.
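The setting described in the abstract can be illustrated with a small sketch: a single-layer perceptron with a sigmoid whose slope `beta` (the saturation degree) is held fixed, trained by gradient descent on an Lp error norm. All names, the weight-renormalization step used to keep the maximum slope fixed, and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z, beta=4.0):
    # beta sets the maximum slope of the activation (the saturation degree)
    return 1.0 / (1.0 + np.exp(-beta * z))

def train_perceptron(X, y, p=2, beta=4.0, lr=0.1, epochs=500):
    """Gradient descent on the Lp error norm for a single-layer perceptron.

    Hypothetical sketch: the weight vector is renormalized each epoch so
    that the slope of the sigmoid in sample space stays fixed at beta.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        z = X @ w + b
        out = sigmoid(z, beta)
        err = out - y
        # gradient of |err|^p: p * |err|^(p-1) * sign(err) * sigmoid'(z)
        s = beta * out * (1.0 - out)          # derivative of sigmoid wrt z
        g = p * np.abs(err) ** (p - 1) * np.sign(err) * s
        w -= lr * (g @ X) / n
        b -= lr * g.mean()
        nrm = np.linalg.norm(w)
        if nrm > 0:                           # keep saturation degree fixed
            w = w / nrm
    return w, b

# toy linearly separable data: two well-separated Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
w, b = train_perceptron(X, y)
pred = (sigmoid(X @ w + b) > 0.5).astype(int)
print((pred == y).mean())  # → 1.0 on this separable toy set
```

For larger `beta` the sigmoid approaches a hard threshold, which is the highly saturated regime in which the abstract's approximation results apply.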