Clustering-based algorithms for single-hidden-layer sigmoid perceptron

  • Authors:
  • Z. Uykan

  • Affiliations:
  • Control Eng. Lab., Helsinki Univ. of Technol., Espoo, Finland

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2003

Abstract

Gradient-descent type supervised learning is the most commonly used algorithm for design of the standard sigmoid perceptron (SP). However, it is computationally expensive (slow) and suffers from the local-minima problem. Moody and Darken (1989) proposed an input-clustering based hierarchical algorithm for fast learning in networks of locally tuned neurons in the context of radial basis function networks. We propose and analyze input clustering (IC) and input-output clustering (IOC)-based algorithms for fast learning in networks of globally tuned neurons in the context of the SP. It is shown that "localizing" the input layer weights of the SP by the IC and the IOC minimizes an upper bound on the SP output error. The proposed algorithms could also be used to initialize the SP weights for conventional gradient-descent learning. Simulation results show that the SPs designed by the IC and the IOC yield performance comparable to their radial basis function network counterparts.
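
The abstract describes the general idea of setting the hidden-layer (input-layer) weights of a sigmoid perceptron from cluster centers and then solving the output layer directly. The sketch below is only an illustration of that idea, not the paper's exact IC/IOC procedure: it assumes k-means on the inputs, a hypothetical centroid-to-weight scaling (`gain`), and a linear least-squares fit for the output layer.

```python
# Illustrative sketch of clustering-based design of a single-hidden-layer
# sigmoid perceptron. The centroid-to-weight mapping and the use of
# scikit-learn's KMeans are assumptions for demonstration only.
import numpy as np
from sklearn.cluster import KMeans

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_ic_sigmoid_perceptron(X, y, n_hidden=10, gain=2.0, seed=0):
    # Input clustering: place one hidden unit per input-space cluster center.
    km = KMeans(n_clusters=n_hidden, n_init=10, random_state=seed).fit(X)
    centers = km.cluster_centers_                  # (n_hidden, n_in)
    W = gain * centers                             # hypothetical centroid-to-weight map
    b = -np.sum(W * centers, axis=1)               # bias so each unit is centered on its cluster
    H = sigmoid(X @ W.T + b)                       # hidden-layer activations
    # Output layer: ordinary least squares on the hidden activations plus a bias column.
    H1 = np.hstack([H, np.ones((X.shape[0], 1))])
    beta, *_ = np.linalg.lstsq(H1, y, rcond=None)
    return W, b, beta

def predict(X, W, b, beta):
    H1 = np.hstack([sigmoid(X @ W.T + b), np.ones((X.shape[0], 1))])
    return H1 @ beta

# Toy usage: approximate a 1-D nonlinear function.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
W, b, beta = fit_ic_sigmoid_perceptron(X, y, n_hidden=12)
print("train MSE:", np.mean((predict(X, W, b, beta) - y) ** 2))
```

Such a clustering-plus-least-squares construction avoids iterative gradient descent entirely; as noted in the abstract, the resulting weights could alternatively serve as an initialization for conventional gradient-descent training.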