Performance surfaces of a single-layer perceptron

  • Authors:
  • J. J. Shynk

  • Affiliations:
  • Dept. of Electrical & Computer Engineering, University of California, Santa Barbara, CA

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1990


Abstract

A perceptron learning algorithm may be viewed as a steepest-descent method whereby an instantaneous performance function is iteratively minimized. An appropriate performance function for the most widely used perceptron algorithm is described, and it is shown that the update term of the algorithm is the gradient of this function. An example of the corresponding performance surface is given under Gaussian assumptions, and it is shown that the surface has infinitely many stationary points. The performance surfaces of two related performance functions are also examined. Computer simulations demonstrating the convergence properties of the adaptive algorithms are given.
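
To make the steepest-descent viewpoint concrete, the sketch below (not from the paper; a minimal NumPy rendering with hypothetical names such as `perceptron_sgd` and step size `mu`) writes the standard perceptron rule w ← w + μ·e·x as a gradient step on one common instantaneous performance function, J(w) = −e·wᵀx with e held fixed, whose gradient −e·x reproduces the update. This is an illustrative choice and not necessarily the exact performance function analyzed in the paper.

```python
import numpy as np

def perceptron_sgd(X, d, mu=0.1, epochs=20, seed=0):
    """Perceptron rule written as instantaneous steepest descent (illustrative sketch).

    For each input x with desired output d in {-1, +1}, the output is
    y = sgn(w^T x) and the error is e = d - y. Treating e as a constant,
    the instantaneous performance function J(w) = -e * w^T x has gradient
    grad J = -e * x, so the steepest-descent step w <- w - mu * grad J
    is the familiar update w <- w + mu * e * x.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            y = 1.0 if w @ x >= 0 else -1.0   # hard-limiter output sgn(w^T x)
            e = target - y                    # error; zero when classified correctly
            w += mu * e * x                   # w - mu * grad J, with grad J = -e * x
    return w
```

On correctly classified samples e = 0 and the weights are left unchanged, so the descent steps occur only on misclassifications, which is why the trajectory follows the perceptron algorithm's convergence behavior rather than that of a smooth mean-square-error surface.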