Unsupervised learning of sigmoid perceptron

  • Authors:
  • Z. Uykan; H. N. Koivo

  • Affiliations:
  • Control Eng. Lab., Helsinki Univ. of Technol., Espoo, Finland

  • Venue:
  • ICASSP '00: Proceedings of the 2000 IEEE International Conference on Acoustics, Speech, and Signal Processing - Volume 06
  • Year:
  • 2000

Abstract

A previous paper derived a clustering-based upper bound on the mean squared output error of radial basis function networks that depends explicitly on the network parameters. In this study we focus on the single-hidden-layer sigmoid perceptron. Using the analysis of the previous paper, this paper (i) presents a similar upper bound on the output error of the sigmoid perceptron, which can be made arbitrarily small by increasing the number of sigmoid units, and (ii) proposes unsupervised learning of the input-layer (synaptic) weights in place of traditional gradient-descent-type supervised learning: the proposed method minimizes the upper bound with a clustering algorithm that determines the input-layer weights, rather than minimizing the output error with the gradient-descent-type algorithm traditionally used to design the perceptron. The simulation results show that (i) the proposed hierarchical method requires less learning time than the gradient-descent-type supervised algorithm, (ii) it yields performance comparable to that of a radial basis function network, and (iii) the upper bounds minimized during clustering are quite tight to the output error function.
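The two-stage scheme the abstract describes can be sketched in code: cluster the inputs to fix the hidden (input-layer) weights without labels, then solve the output weights by least squares. This is a minimal illustration, not the paper's algorithm; the mapping from cluster centers to sigmoid weights and biases (`gamma`, `2*gamma*c`, `-gamma*||c||^2`) is a heuristic assumption, and `kmeans`, `fit`, and `predict` are hypothetical helper names.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means; returns the k cluster centers (the unsupervised step)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        labels = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit(X, y, n_hidden=10, gamma=2.0, seed=0):
    # (1) Unsupervised: input-layer weights come from clustering the inputs,
    #     not from gradient descent on the output error.  The center-to-weight
    #     mapping below is an illustrative heuristic, not the paper's formula.
    c = kmeans(X, n_hidden, seed=seed)
    W = 2.0 * gamma * c                        # each row: weights of one sigmoid unit
    b = -gamma * (c ** 2).sum(axis=1)          # bias centers the unit's response
    # (2) Supervised: with the hidden layer fixed, the output weights are a
    #     linear least-squares problem (a bias column is appended).
    H = np.column_stack([sigmoid(X @ W.T + b), np.ones(len(X))])
    w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, w_out

def predict(X, W, b, w_out):
    H = np.column_stack([sigmoid(X @ W.T + b), np.ones(len(X))])
    return H @ w_out

# Toy usage: fit a 1-D regression target with the clustering-based perceptron.
rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0])
W, b, w_out = fit(X, y, n_hidden=10)
y_hat = predict(X, W, b, w_out)
```

Because the expensive nonlinear optimization is replaced by one clustering pass plus one linear solve, training is non-iterative in the output error, which is the source of the speed advantage the abstract reports over gradient-descent learning.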