An Algebraic Model for Generating and Adapting Neural Networks by Means of Optimization Methods

  • Authors:
  • Dolores Barrios; Daniel Manrique; M. Rosario Plaza; Juan Ríos

  • Affiliations:
  • Departamento de Inteligencia Artificial, Facultad de Informática, Campus de Montegancedo, S/N, Boadilla del Monte – 28660 Madrid, Spain (all authors). E-mail: jrios@fi.upm.es

  • Venue:
  • Annals of Mathematics and Artificial Intelligence
  • Year:
  • 2001

Abstract

This paper describes a new scheme for the binary codification of artificial neural networks, designed to generate neural networks automatically with any optimization method. Instead of mapping strings of bits directly onto network connectivities, this codification abstracts the binary encoding so that it does not reference the artificial indexing of network nodes; it yields shorter strings and avoids illegal points in the search space without excluding any legal neural network. To this end, an Abelian semigroup structure with a neutral element is obtained on the set of artificial neural networks through an internal operation called superimposition, which allows complex neural networks to be built from minimal useful structures. The scheme preserves the desirable property, for search algorithms, that similar neural networks differ in only one bit. Experimental results obtained by combining this codification with genetic algorithms are reported and compared to other codification methods in terms of convergence speed and the size of the networks obtained as solutions.
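
The abstract does not define the superimposition operation precisely; as a rough illustration only, the sketch below models a network as a boolean connectivity matrix and superimposition as the union (element-wise OR) of connections. Under that assumption, commutativity, associativity, and a neutral element (the empty network) follow immediately, matching the Abelian semigroup with neutral element mentioned above. The function name `superimpose` and the 4-node example matrices are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): each network is a boolean
# connectivity matrix, and "superimposition" is modeled as element-wise OR,
# i.e., the union of the two networks' connections.
import numpy as np

def superimpose(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Combine two networks by taking the union of their connections."""
    return np.logical_or(a, b)

# Two small "minimal useful structures" over 4 nodes (hypothetical example).
net_a = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]], dtype=bool)
net_b = np.array([[0, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 0, 1],
                  [0, 0, 0, 0]], dtype=bool)
empty = np.zeros((4, 4), dtype=bool)  # neutral element: the empty network

combined = superimpose(net_a, net_b)

# Under this modeling, the Abelian semigroup properties hold:
assert np.array_equal(superimpose(net_a, net_b),
                      superimpose(net_b, net_a))            # commutativity
assert np.array_equal(superimpose(combined, empty),
                      combined)                              # neutral element
assert np.array_equal(superimpose(net_a, superimpose(net_b, net_a)),
                      superimpose(superimpose(net_a, net_b), net_a))  # associativity
```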