Feed-forward neural networks and minimal search space learning

  • Authors:
  • Roman Neruda

  • Affiliations:
  • Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague, Czech Republic

  • Venue:
  • CIMMACS'05 Proceedings of the 4th WSEAS international conference on Computational intelligence, man-machine systems and cybernetics
  • Year:
  • 2005


Abstract

The functional equivalence of feed-forward networks has been proposed as a means to reduce the search space of learning algorithms. The description of the equivalence classes has been used to establish a unique parameterization property and, consequently, the so-called canonical parameterizations as representatives of the functional equivalence classes. A novel genetic learning algorithm for RBF networks and for perceptrons with one hidden layer is proposed that operates only on these canonical parameterizations. Experimental results show that this procedure outperforms standard genetic learning. An important extension of the theoretical results demonstrates that the approach remains valid in the case of approximation.
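The abstract does not spell out the construction, but the canonical parameterizations it refers to rest on a well-known fact: a one-hidden-layer network's function is unchanged by permuting hidden units and, for odd activations such as tanh, by flipping the sign of a unit's weights together with its output weight. The sketch below (an illustrative convention, not necessarily the paper's exact one; all function and variable names are assumptions) picks one representative per equivalence class by normalizing signs and sorting units:

```python
import numpy as np

def canonicalize(W, b, v):
    """Map a one-hidden-layer tanh network to a canonical representative
    of its functional-equivalence class.

    W : (hidden, inputs) input-to-hidden weights
    b : (hidden,) hidden biases
    v : (hidden,) hidden-to-output weights

    Two symmetries leave the network function x -> tanh(W x + b) . v
    unchanged:
      * sign flip of unit i: (w_i, b_i, v_i) -> (-w_i, -b_i, -v_i),
        because tanh is odd;
      * any permutation of the hidden units.
    Canonical form used here: make the first nonzero entry of each
    unit's (w_i, b_i) positive, then sort units lexicographically.
    """
    W, b, v = W.copy(), b.copy(), v.copy()
    for i in range(W.shape[0]):
        row = np.concatenate([W[i], [b[i]]])
        nz = row[row != 0]
        if nz.size and nz[0] < 0:      # normalize the unit's sign
            W[i], b[i], v[i] = -W[i], -b[i], -v[i]
    # lexicographic sort of units; np.lexsort's last key is primary,
    # so reverse the key order to sort by the first weight column first
    keys = np.column_stack([W, b, v]).T[::-1]
    order = np.lexsort(keys)
    return W[order], b[order], v[order]

# Two functionally equivalent networks map to the same representative:
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))
b = rng.standard_normal(3)
v = rng.standard_normal(3)
perm = [2, 0, 1]
W2, b2, v2 = -W[perm], -b[perm], -v[perm]   # permute + flip all units
c1 = canonicalize(W, b, v)
c2 = canonicalize(W2, b2, v2)
print(all(np.allclose(a, c) for a, c in zip(c1, c2)))
```

A genetic algorithm restricted to such representatives never wastes fitness evaluations on distinct parameter vectors that encode the same network function, which is the search-space reduction the abstract describes.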