Neural architectures optimization and genetic algorithms

  • Authors:
  • Mohamed Ettaouil; Youssef Ghanou

  • Affiliations:
  • Scientific calculation and Computing, Engineering sciences, Department of Mathematics and Computer science, Faculty of Science and Technology of Fez, University Sidi Mohammed ben Abdellah, Fez, Morocco (both authors)

  • Venue:
  • WSEAS Transactions on Computers
  • Year:
  • 2009

Abstract

Artificial neural networks (ANNs) have proven their efficiency in several applications, such as pattern recognition, voice processing and classification problems. The training stage is very important for an ANN's performance. The selection of a neural network architecture suited to a given problem is one of the most important aspects of neural network research. The choice of the number of hidden layers and of the weight values has a large impact on the convergence of the training algorithm. In this paper we propose a mathematical formulation to determine the optimal number of hidden layers and good values of the weights. To solve this problem, we use genetic algorithms. Computational experiments are presented, and the numerical results assess the effectiveness of the theoretical results shown in this paper and the advantages of the new modelling.
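
The abstract does not give the authors' mathematical formulation, so the following is only a minimal sketch of the general idea it describes: a genetic algorithm whose chromosomes jointly encode the number of hidden layers and the weight values of a feed-forward network. The XOR task, the fixed layer width, the depth-penalty term and all operator settings below are illustrative assumptions, not the paper's model.

```python
# Sketch: GA jointly searching hidden-layer count and weights (assumed setup, not the authors' formulation).
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # toy XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)               # XOR targets

MAX_LAYERS, WIDTH = 3, 4  # search over 1..MAX_LAYERS hidden layers of fixed width
# Weight-matrix shapes for the largest admissible architecture (last one is the output layer).
SHAPES = [(2, WIDTH)] + [(WIDTH, WIDTH)] * (MAX_LAYERS - 1) + [(WIDTH, 1)]
N_WEIGHTS = sum(r * c for r, c in SHAPES)

def decode(genome):
    """Split a genome into (number of active hidden layers, list of weight matrices)."""
    n_layers = int(genome[0]) % MAX_LAYERS + 1
    flat, mats, i = genome[1:], [], 0
    for r, c in SHAPES:
        mats.append(flat[i:i + r * c].reshape(r, c))
        i += r * c
    return n_layers, mats

def forward(genome):
    n_layers, mats = decode(genome)
    h = X
    for w in mats[:n_layers]:            # only the first n_layers hidden layers are used
        h = np.tanh(h @ w)
    return 1.0 / (1.0 + np.exp(-(h @ mats[-1])))  # sigmoid output layer

def fitness(genome):
    """Negative MSE plus a small penalty on depth, so compact architectures are preferred."""
    n_layers, _ = decode(genome)
    return -(np.mean((forward(genome) - y) ** 2) + 0.01 * n_layers)

def random_genome():
    return np.concatenate(([rng.integers(0, MAX_LAYERS)], rng.normal(0, 1, N_WEIGHTS)))

def crossover(a, b):
    return np.where(rng.random(a.shape) < 0.5, a, b)   # uniform crossover

def mutate(g, rate=0.1, sigma=0.3):
    g = g.copy()
    mask = rng.random(N_WEIGHTS) < rate
    g[1:][mask] += rng.normal(0, sigma, mask.sum())     # Gaussian weight perturbation
    if rng.random() < 0.1:
        g[0] = rng.integers(0, MAX_LAYERS)              # occasionally change the depth gene
    return g

pop = [random_genome() for _ in range(40)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                                    # truncation selection
    children = []
    while len(children) < 30:
        i, j = rng.choice(len(elite), size=2, replace=False)
        children.append(mutate(crossover(elite[i], elite[j])))
    pop = elite + children

best = max(pop, key=fitness)
n_layers, _ = decode(best)
print(f"best depth: {n_layers} hidden layer(s), "
      f"MSE: {np.mean((forward(best) - y) ** 2):.4f}")
```

In this sketch the chromosome carries weights for the deepest admissible network and a depth gene masks the unused layers, which keeps crossover well defined across individuals with different numbers of hidden layers; the paper's actual encoding and optimisation criterion may differ.
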