Eigenvalue decay: A new method for neural network regularization

  • Authors:
  • Oswaldo Ludwig, Urbano Nunes, Rui Araujo


  • Venue:
  • Neurocomputing
  • Year:
  • 2014

Abstract

This paper proposes two new training algorithms for multilayer perceptrons based on evolutionary computation, regularization, and transduction. Regularization is a commonly used technique for preventing the learning algorithm from overfitting the training data. In this context, this work introduces and analyzes a novel regularization scheme for neural networks (NNs), named eigenvalue decay, which aims at improving the classification margin. The introduction of eigenvalue decay led to the development of a new training method based on the same principles as the SVM, and thus named Support Vector NN (SVNN). Finally, by analogy with the transductive SVM (TSVM), a transductive NN (TNN) is proposed, which exploits the SVNN to address transductive learning. The effectiveness of the proposed algorithms is evaluated on seven benchmark datasets.
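The sketch below illustrates the general idea of an eigenvalue-decay-style penalty: adding a term proportional to the dominant eigenvalue of W W^T (equivalently, the squared spectral norm of a weight matrix W) to the training loss. This is only a minimal illustration under that assumption; it uses ordinary gradient descent rather than the evolutionary training the abstract describes, and the network, penalty weight, and data are hypothetical placeholders, not the authors' method.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Single-hidden-layer perceptron used only to illustrate the penalty."""
    def __init__(self, n_in, n_hidden, n_out):
        super().__init__()
        self.hidden = nn.Linear(n_in, n_hidden)
        self.out = nn.Linear(n_hidden, n_out)

    def forward(self, x):
        return self.out(torch.tanh(self.hidden(x)))

def eigenvalue_decay_penalty(weight):
    # The dominant eigenvalue of W W^T equals the squared largest singular
    # value of W; ord=2 gives the spectral norm (largest singular value).
    return torch.linalg.matrix_norm(weight, ord=2) ** 2

# Hypothetical training step: task loss plus the eigenvalue-decay term.
model = MLP(n_in=20, n_hidden=50, n_out=2)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-3  # regularization strength (placeholder value)

x = torch.randn(64, 20)          # dummy batch of inputs
y = torch.randint(0, 2, (64,))   # dummy class labels

optimizer.zero_grad()
loss = criterion(model(x), y) + lam * eigenvalue_decay_penalty(model.hidden.weight)
loss.backward()
optimizer.step()
```

In this reading, shrinking the dominant eigenvalue of the hidden-layer weight matrix bounds how strongly the layer can amplify its inputs, which is one plausible route to the margin improvement the abstract attributes to eigenvalue decay.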