Saturated Perceptrons for Maximum Margin and Minimum Misclassification Error

  • Authors:
  • Jesús Cid-Sueiro; José L. Sancho-Gómez

  • Affiliations:
  • Dpto. de Tecnologías de las Comunicaciones, Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganés-Madrid, Spain. E-mail: jcid@tsc.uc3m.es

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2001

Abstract

This Letter discusses the application of gradient-based methods to train a single-layer perceptron subject to the constraint that the saturation degree of the sigmoid activation function (measured as its maximum slope in the sample space) is fixed to a given value. From a theoretical standpoint, we show that, if the training set is not linearly separable, minimizing an Lp error norm approximates the minimum-error classifier, provided the perceptron is highly saturated. Moreover, if the data are linearly separable, the perceptron approximates the maximum-margin classifier.
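
The training scheme described in the abstract can be sketched as projected gradient descent: the Lp error norm is minimized while the weight vector is rescaled after every step so that the sigmoid's maximum slope over the sample space (the saturation degree) stays fixed. The snippet below is a minimal illustration under that reading, not the authors' implementation; names such as `saturation`, `p`, and `train_saturated_perceptron` are assumptions introduced here.

```python
# Minimal sketch (assumed, not the paper's code): a single-layer perceptron
# with logistic activation, trained by gradient descent on an Lp error norm,
# with the weight norm projected back to a fixed value after each step so the
# saturation degree stays constant.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_saturated_perceptron(X, y, saturation=10.0, p=2, lr=0.01, epochs=500):
    """Minimize sum_i |y_i - sigmoid(w.x_i + b)|^p with ||w|| held at `saturation`.
    y is assumed to take values in {0, 1}."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    w = rng.normal(size=d)
    w *= saturation / np.linalg.norm(w)      # start on the constraint surface
    b = 0.0
    for _ in range(epochs):
        out = sigmoid(X @ w + b)
        err = out - y
        # gradient of |err|^p w.r.t. the pre-activation, per sample
        grad_z = p * np.abs(err) ** (p - 1) * np.sign(err) * out * (1.0 - out)
        w -= lr * (X.T @ grad_z) / n
        b -= lr * grad_z.mean()
        w *= saturation / np.linalg.norm(w)  # project back: fixed saturation degree
    return w, b
```

Per the abstract's claims, with a large `saturation` value this procedure should approach the minimum-error boundary on non-separable data and the maximum-margin boundary on separable data.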