An interval approach for weight's initialization of feedforward neural networks

  • Authors:
  • Marcela Jamett; Gonzalo Acuña

  • Affiliations:
  • Departamento de Diseño, Universidad Tecnológica Metropolitana, UTEM, Santiago, Chile; Departamento de Ingeniería Informática, Universidad de Santiago de Chile, USACH, Santiago, Chile

  • Venue:
  • MICAI'06 Proceedings of the 5th Mexican international conference on Artificial Intelligence
  • Year:
  • 2006


Abstract

This work addresses an important problem in Feedforward Neural Network (FNN) training: finding the pseudo-global minimum of the cost function while ensuring good generalization properties for the trained architecture. First, pseudo-global optimization is achieved by a combined parameter-updating algorithm supported by the transformation of network parameters into interval numbers. This solves the network weight initialization problem by performing an exhaustive search for minima by means of Interval Arithmetic (IA). The global minimum is then obtained once the search has been limited to the region of convergence (ROC). IA represents variables and parameters as compact closed sets, so a training procedure using interval weights can be carried out. The methodology is exemplified by the approximation of a known non-linear function in the last section.
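To illustrate the general idea behind the abstract (not the authors' actual algorithm), the sketch below shows how interval arithmetic can bound a cost function over boxes of weight values and discard boxes that cannot contain the global minimum, narrowing the search toward a region of convergence. The one-dimensional cost function and search range are purely illustrative assumptions.

```python
# Minimal interval branch-and-bound sketch. Assumptions: a toy 1-D cost
# (w - 2)^2 standing in for an FNN cost surface; the real paper operates
# on vectors of interval weights.

class Interval:
    """A compact closed interval [lo, hi] with IA addition and multiplication."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def width(self):
        return self.hi - self.lo

def cost(w):
    # Illustrative cost (w - 2)^2, expressed with interval operations only.
    d = w + Interval(-2.0, -2.0)
    return d * d

def branch_and_bound(box, tol=1e-3):
    """Bisect interval boxes, pruning any box whose lower cost bound
    exceeds the best upper bound found so far; the last surviving
    narrow box approximates the region of convergence."""
    best_upper = cost(box).hi
    stack = [box]
    result = box
    while stack:
        b = stack.pop()
        c = cost(b)
        if c.lo > best_upper:        # box cannot contain the minimum: prune
            continue
        best_upper = min(best_upper, c.hi)
        if b.width() < tol:          # narrow enough: candidate region
            result = b
            continue
        mid = 0.5 * (b.lo + b.hi)
        stack.append(Interval(b.lo, mid))
        stack.append(Interval(mid, b.hi))
    return result

roc = branch_and_bound(Interval(-10.0, 10.0))
print(roc.lo, roc.hi)  # a narrow interval close to the minimizer w = 2
```

Because interval bounds are guaranteed enclosures, pruning never discards the true minimum; a local (point-valued) training method can then be started safely inside the returned region.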