Modulated Hebb-Oja learning rule - a method for principal subspace analysis

  • Authors:
  • M. V. Jankovic; H. Ogawa

  • Affiliations:
  • Electr. Eng. Inst. "Nikola Tesla", Belgrade, Serbia;-

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2006

Abstract

This paper presents an analysis of the recently proposed modulated Hebb-Oja (MHO) method, which performs a linear mapping onto a lower-dimensional subspace; the subspace considered here is the principal component subspace. Compared with other well-known methods for extracting the principal component subspace (e.g., Oja's Subspace Learning Algorithm), the proposed method has a feature that can be seen as desirable from a biological point of view: the learning rule for a synaptic efficacy does not require explicit information about the values of the other efficacies in order to modify that individual efficacy. The simplicity of the "neural circuits" that perform global computations, and the fact that their number does not depend on the number of input and output neurons, can also be seen as attractive features of the proposed method.
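
For concreteness, the baseline named in the abstract, Oja's Subspace Learning Algorithm, updates the weight matrix as W ← W + η(yxᵀ − yyᵀW) with y = Wx; each individual weight update there depends on the whole matrix W, which is exactly the coupling the MHO rule is said to avoid. Below is a minimal sketch of that baseline in Python/NumPy; the learning rate, dimensions, and synthetic data are illustrative assumptions, and the MHO rule itself is not reproduced, since its exact form is not given in this abstract.

```python
import numpy as np

def oja_subspace_step(W, x, eta=0.01):
    """One update of Oja's Subspace Learning Algorithm (the baseline cited
    in the abstract).  W has shape (m, n): m output neurons, n inputs.
    Note that the update of each weight W[i, j] uses y y^T W, i.e. it needs
    the values of the other efficacies -- the property the MHO rule avoids."""
    y = W @ x                               # outputs of the linear layer
    W += eta * (np.outer(y, x) - np.outer(y, y) @ W)
    return W

# Illustrative usage: extract a 2-D principal subspace of 5-D data
# (dimensions, sample count, and learning rate are assumptions).
rng = np.random.default_rng(0)
mix = rng.normal(size=(5, 5))
data = rng.normal(size=(2000, 5)) @ mix     # correlated synthetic inputs
W = rng.normal(scale=0.1, size=(2, 5))      # random initial efficacies
for x in data:
    W = oja_subspace_step(W, x, eta=0.005)
# The rows of W now approximately span the 2-D principal subspace of the data.
```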