Fast retraining of artificial neural networks

  • Authors:
  • Dumitru-Iulian Nastac; Razvan Matei

  • Affiliations:
  • Turku Centre for Computer Science and Åbo Akademi University, Turku, Finland; Nokia Oy, Helsinki, Finland

  • Venue:
  • RSFDGrC'03 Proceedings of the 9th international conference on Rough sets, fuzzy sets, data mining, and granular computing
  • Year:
  • 2003

Abstract

In this paper we propose a practical mechanism for extracting information directly from the weights of a reference artificial neural network (ANN). We use this information to train a structurally identical ANN whose global input-output transformation function differs somewhat from the reference. To this end, we reduce the reference network's weights by a scaling factor. An evaluation of the computational effort involved in retraining several ANNs shows that a good choice of scaling factor can substantially reduce the number of training cycles, independently of the learning method. The retraining mechanism is analyzed for feedforward ANNs with two inputs and one output.
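The mechanism described in the abstract can be sketched roughly as follows: train a reference network, multiply its weights by a scaling factor, and use the scaled weights as the starting point for training a structurally identical network on a slightly modified input-output mapping, comparing the training cycles needed against a from-scratch initialization. This is a minimal NumPy sketch under stated assumptions; the backprop loop, the network size, the tasks, and the scaling factor value `gamma` are all illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def add_bias(a):
    # Append a constant-1 column so weight matrices carry the biases.
    return np.hstack([a, np.ones((a.shape[0], 1))])

def train(W1, W2, X, y, lr=0.5, tol=0.01, max_epochs=20000):
    """Plain batch backprop on a 2-input, 1-output feedforward net.

    Returns the (possibly updated) weights and the number of epochs used.
    """
    Xb = add_bias(X)
    for epoch in range(1, max_epochs + 1):
        h = sigmoid(Xb @ W1)           # hidden activations
        hb = add_bias(h)
        out = sigmoid(hb @ W2)         # network output
        err = out - y
        if np.mean(err ** 2) < tol:    # stop once the fit is good enough
            break
        # Backpropagate the squared-error gradient.
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T)[:, :-1] * h * (1 - h)
        W2 -= lr * hb.T @ d_out
        W1 -= lr * Xb.T @ d_h
    return W1, W2, epoch

# Reference task: an arbitrary two-input, one-output mapping (assumed).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_ref = np.array([[0.1], [0.9], [0.9], [0.1]], dtype=float)

H = 4  # hidden units (illustrative)
W1, W2, _ = train(rng.normal(scale=0.5, size=(3, H)),
                  rng.normal(scale=0.5, size=(H + 1, 1)), X, y_ref)

# Modified task: a small variation of the reference mapping.
y_new = np.array([[0.2], [0.8], [0.8], [0.2]], dtype=float)

# Retrain from the scaled reference weights vs. from scratch.
gamma = 0.7  # scaling factor; a hypothetical value, not from the paper
_, _, ep_scaled = train(gamma * W1, gamma * W2, X, y_new)
_, _, ep_fresh = train(rng.normal(scale=0.5, size=(3, H)),
                       rng.normal(scale=0.5, size=(H + 1, 1)), X, y_new)
print("epochs from scaled weights:", ep_scaled)
print("epochs from scratch:", ep_fresh)
```

The paper's contribution lies in how the scaling factor is chosen from the reference weights; the sketch only shows where such a factor would plug into an ordinary retraining loop.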