Book review: Competitively Inhibited Neural Networks for Adaptive Parameter Estimation by Michael Lemmon (Kluwer Academic Publishers, 1991)

  • Author:
  • Joseph M. Barone

  • Affiliation:
  • Loki Software, Inc., P.O. Box 71, Liberty Corner, NJ 07938

  • Venue:
  • ACM SIGART Bulletin
  • Year:
  • 1992

Abstract

Rigorous, formal treatments of neural network fundamentals (i.e., treatments whose arguments consist primarily of theorems and proofs) have by now addressed a number of aspects. The convergence properties (stability) and capacity of neural nets of various types have been analyzed in this manner to one degree or another (e.g., [1-3]), and their expressive power has also been the subject of several formal analyses (e.g., [4]). Formal treatments of network dynamics also exist, though they are not always rigorous in the sense just mentioned. For reasons which will become clear below, the remarkable paper of Amari and Maginu [5] is especially worthy of note. In that paper, the authors demonstrate that the size of the "basin" of an equilibrium (learned, if you will) state in an autocorrelation associative memory stands in a complex (but predictable) relationship with the distance between that state and the input vector. This paper, along with others by Amari and his coworkers, illustrates clearly that an in-depth analysis of seemingly mundane properties of neural networks may yield surprising results of considerable significance.
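To make the basin-of-attraction notion concrete, the following is a minimal sketch (my own illustration, not the book's construction or Amari and Maginu's analysis) of an autocorrelation associative memory: Hebbian outer-product storage plus sign-threshold recall dynamics, probed from inputs at increasing Hamming distance from a stored state. The overlap between the recalled state and the target indicates whether the probe fell inside the equilibrium's basin.

```python
import numpy as np

def train(patterns):
    """Autocorrelation (Hebbian) weight matrix W = (1/n) * P^T P, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=50):
    """Iterate synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        x_new = np.where(W @ x >= 0, 1.0, -1.0)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

rng = np.random.default_rng(0)
n = 200
patterns = rng.choice([-1.0, 1.0], size=(5, n))  # 5 stored states: a light load
W = train(patterns)
target = patterns[0]

# Probe from inputs at increasing Hamming distance from the stored state;
# overlap of 1.0 means the dynamics returned exactly to the learned equilibrium.
overlaps = {}
for d in (10, 40, 80):
    probe = target.copy()
    flip = rng.choice(n, size=d, replace=False)
    probe[flip] *= -1  # corrupt d components of the stored state
    overlaps[d] = float(recall(W, probe) @ target / n)
print(overlaps)
```

At this light storage load, probes close to the stored state (small d) are recovered essentially perfectly, while sufficiently distant probes may converge elsewhere; the dependence of recovery on the probe's distance is the kind of relationship the Amari-Maginu analysis characterizes rigorously.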