Neural learning methods yielding functional invariance

  • Authors:
  • Vicente Ruiz de Angulo; Carme Torras

  • Affiliation (both authors):
  • Institut de Robòtica i Informàtica Industrial (CSIC-UPC), Parc Tecnològic de Barcelona - Edifici U, Llorens i Artigas 4-6, Barcelona 08028, Spain

  • Venue:
  • Theoretical Computer Science
  • Year:
  • 2004


Abstract

This paper investigates the functional invariance of neural network learning methods that incorporate a complexity-reduction mechanism, such as a regularizer. By functional invariance we mean the property of producing functionally equivalent minima as the size of the network grows, while the smoothing parameters are kept fixed. We study three different principles on which functional invariance can be based, and delimit the conditions under which each of them acts. We find that, surprisingly, some of the most popular neural learning methods, such as weight decay and input noise addition, exhibit this interesting property.
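To make the two regularizers named in the abstract concrete, the following is a minimal sketch (not the paper's algorithm) of one SGD step for a toy one-hidden-layer network that combines weight decay (an L2 penalty added to each gradient) with input noise addition (corrupting each input before the forward pass). All names, sizes, and hyperparameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def sgd_step(W1, W2, x, y, lr=0.01, decay=1e-3, noise_std=0.1):
    """One SGD step on squared error, with weight decay and input
    noise addition. A hypothetical sketch; hyperparameters are
    illustrative, not taken from the paper."""
    # Input noise addition: corrupt the input before the forward pass.
    x_noisy = x + rng.normal(0.0, noise_std, size=x.shape)
    h = np.tanh(W1 @ x_noisy)          # hidden activations
    err = W2 @ h - y                   # output error
    # Backpropagation for the squared-error loss on the noisy input.
    gW2 = np.outer(err, h)
    gh = W2.T @ err
    gW1 = np.outer(gh * (1.0 - h**2), x_noisy)
    # Weight decay: the L2 penalty contributes decay * W to each gradient.
    W1 -= lr * (gW1 + decay * W1)
    W2 -= lr * (gW2 + decay * W2)
    return W1, W2

# Illustrative usage: a 3-input, 4-hidden, 2-output network.
W1 = rng.normal(scale=0.1, size=(4, 3))
W2 = rng.normal(scale=0.1, size=(2, 4))
x, y = np.ones(3), np.zeros(2)
for _ in range(10):
    W1, W2 = sgd_step(W1, W2, x, y)
```

The paper's question, in these terms, is whether fixing `decay` and `noise_std` (the smoothing parameters) while enlarging the hidden layer still yields functionally equivalent minima.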