An incremental learning method for neural networks based on sensitivity analysis

  • Authors:
  • Beatriz Pérez-Sánchez, Oscar Fontenla-Romero, Bertha Guijarro-Berdiñas

  • Affiliations:
  • Department of Computer Science, Faculty of Informatics, University of A Coruña, A Coruña, Spain (all authors)

  • Venue:
  • CAEPIA'09: Current Topics in Artificial Intelligence, 13th Conference of the Spanish Association for Artificial Intelligence
  • Year:
  • 2009

Abstract

The Sensitivity-Based Linear Learning Method (SBLLM) is a learning method for two-layer feedforward neural networks, based on sensitivity analysis, that obtains the weights by solving linear systems of equations. This yields an important saving in computational time, which significantly enhances the behavior of this method compared to other batch learning algorithms. The SBLLM works in batch mode; however, several reasons justify the need for an on-line version of the algorithm. Among them are the need for real-time learning in environments where the information is not available at the outset but is continually acquired, and situations in which large databases must be managed with limited computing resources. In this paper an incremental version of the SBLLM is presented. The theoretical basis for the method is given, and its performance is illustrated by comparing the results obtained by the on-line and batch versions of the algorithm.
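To make the two ideas above concrete, the following is a minimal sketch, not the authors' exact formulation. It assumes a single linear layer seen through an invertible activation (tanh here): mapping the targets back through the inverse activation turns training into a linear least-squares problem solvable in one step, and a generic rank-one scheme (recursive least squares, used here purely as an illustrative stand-in for the paper's incremental update) processes samples one at a time without re-solving the system from scratch. All dimensions and names are hypothetical.

```python
import numpy as np

# Batch idea: for z = X @ W observed through an invertible activation f,
# pulling the targets back through f^{-1} makes the problem linear, so the
# weights come from one least-squares solve instead of iterative descent.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                # hypothetical input data
W_true = 0.2 * rng.normal(size=(4, 2))       # small weights keep tanh well inside (-1, 1)
Y = np.tanh(X @ W_true)                      # observed outputs, f = tanh
Z = np.arctanh(Y)                            # f^{-1}(targets): back to a linear problem
W_batch, *_ = np.linalg.lstsq(X, Z, rcond=None)

# On-line idea: recursive least squares (RLS) keeps P ~ (X^T X)^{-1} up to
# date with a rank-one correction per sample, so each new observation costs
# O(d^2) instead of re-solving the whole system.
def rls_update(W, P, x, z):
    """Fold one sample (x, z) into the running solution (W, P)."""
    x = x.reshape(-1, 1)                     # column vector
    Px = P @ x
    k = Px / (1.0 + float(x.T @ Px))         # gain vector
    W = W + k @ (z.reshape(1, -1) - x.T @ W) # correct by the prediction error
    P = P - k @ Px.T                         # rank-one downdate of P
    return W, P

W_rls = np.zeros((4, 2))
P = 1e6 * np.eye(4)                          # large P acts as an uninformative prior
for x, z in zip(X, Z):
    W_rls, P = rls_update(W_rls, P, x, z)
```

After a single pass over the stream, the recursive solution agrees with the batch least-squares solution up to the tiny regularisation implied by the initial P, which is the property that makes this style of update attractive when data arrive continually or memory is limited.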