Neural network architecture selection: size depends on function complexity

  • Authors:
  • Iván Gómez; Leonardo Franco; José L. Subirats; José M. Jerez

  • Affiliations:
  • Departamento de Lenguajes y Ciencias de la Computación, Universidad de Málaga, Málaga, Spain (all authors)

  • Venue:
  • ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part I
  • Year:
  • 2006

Abstract

The relationship between generalization ability, neural network size, and function complexity is analyzed in this work. The dependence of the generalization process on the complexity of the function implemented by the neural architecture is studied using a recently introduced measure for the complexity of Boolean functions. Furthermore, an association rule discovery (ARD) technique was used to find associations among subsets of items in the whole set of simulation results. The main result of the paper is that, for a set of quasi-randomly generated Boolean functions, large neural networks generalize better on high-complexity functions, whereas smaller networks perform better on low- and medium-complexity functions.
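As a rough illustration of how a Boolean function complexity measure of this kind can be computed, the sketch below uses a simple first-order proxy: the fraction of Hamming-distance-1 input pairs whose outputs differ. This is an assumption for illustration only, not necessarily the exact measure used in the paper; the function names are hypothetical.

```python
from itertools import product

def boundary_complexity(f, n):
    """Illustrative first-order complexity proxy for a Boolean function
    f on n inputs: the fraction of input pairs at Hamming distance 1
    whose outputs differ. (Hypothetical sketch; the paper's measure
    may differ in detail.)"""
    diff = 0
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] ^= 1          # flip one input bit to get a neighbor
            total += 1
            diff += fx != f(tuple(y))
    return diff / total

# Parity flips its output on every single-bit change: maximal complexity.
parity = lambda x: sum(x) % 2
# A constant function never changes its output: minimal complexity.
const = lambda x: 0

print(boundary_complexity(parity, 3))  # → 1.0
print(boundary_complexity(const, 3))   # → 0.0
```

Under a proxy like this, high-complexity functions (e.g. parity) are exactly those whose output changes often between neighboring inputs, which matches the intuition that they are harder to generalize from limited samples.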