Original Contribution: Least mean square error reconstruction principle for self-organizing neural-nets

  • Authors:
  • Lei Xu

  • Affiliations:
  • -

  • Venue:
  • Neural Networks
  • Year:
  • 1993

Abstract

We propose a new self-organizing net based on the principle of Least Mean Square Error Reconstruction (LMSER) of an input pattern. With this principle, a local learning rule, called the LMSER rule, is obtained naturally for training nets consisting of either one or several layers. We prove that for one layer with n₁ linear units, the LMSER rule lets the weights converge to rotations of the data's first n₁ principal components. These converged points are stable and correspond to the global minimum of the Mean Square Error (MSE) landscape, which has many saddle points but no local minima. The results indirectly provide a picture of LMSER's global convergence, which also applies to the Oja rule, since we prove that the evolution direction of the Oja rule has a positive projection on that of the LMSER rule. We also reveal the interesting fact that slight modifications of the LMSER rule (and of the Oja rule) can perform true Principal Component Analysis (PCA) without the externally designed asymmetrical circuits required in previous studies.
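
As a rough illustration (not code from the paper), the sketch below takes the one-layer linear LMSER update to be stochastic gradient descent on the reconstruction error E = ||x − WᵀWx||² with tied weights, and runs Oja's subspace rule alongside it for comparison. The synthetic data, step size, and subspace check are illustrative assumptions, not prescriptions from the paper.

```python
# A minimal sketch, assuming the one-layer linear case: the LMSER update is
# stochastic gradient descent on E = ||x - W^T W x||^2 (tied weights), and
# Oja's subspace rule is included for comparison.
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean synthetic data whose covariance has two dominant directions (assumed setup).
d, n1, T = 5, 2, 20000
cov = np.diag([4.0, 2.0, 1.0, 0.5, 0.25])
X = rng.multivariate_normal(np.zeros(d), cov, size=T)

W_lmser = 0.1 * rng.normal(size=(n1, d))   # n1 linear units, d inputs
W_oja = W_lmser.copy()
eta = 1e-3                                 # illustrative step size

for x in X:
    # LMSER update: the gradient of ||x - W^T y||^2 with y = W x gives
    # dW ∝ y e^T + (W e) x^T, where e = x - W^T y is the reconstruction error.
    y = W_lmser @ x
    e = x - W_lmser.T @ y
    W_lmser += eta * (np.outer(y, e) + np.outer(W_lmser @ e, x))

    # Oja's subspace rule, whose evolution direction the abstract relates to LMSER's.
    y = W_oja @ x
    W_oja += eta * (np.outer(y, x) - np.outer(y, y) @ W_oja)

# Both weight matrices should span (a rotation of) the first n1 principal components.
eigvals, eigvecs = np.linalg.eigh(cov)
pc = eigvecs[:, ::-1][:, :n1]              # top-n1 eigenvectors of the data covariance
for name, W in (("LMSER", W_lmser), ("Oja", W_oja)):
    Q, _ = np.linalg.qr(W.T)               # orthonormal basis of span(W^T)
    cosines = np.linalg.svd(Q.T @ pc, compute_uv=False)
    print(f"{name}: smallest cosine of principal angles = {cosines.min():.4f}")
```

Printed cosines near 1 indicate that the learned row space coincides with the principal subspace, i.e., the converged weights are a rotation of the first n₁ principal components rather than the individual components themselves; obtaining the true PCA basis requires the kind of asymmetric modification the abstract alludes to.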