Robust recursive least squares learning algorithm for principal component analysis

  • Authors:
  • Shan Ouyang; Zheng Bao; Gui-Sheng Liao

  • Affiliations:
  • Key Lab. for Radar Signal Process., Xidian Univ., Xi'an

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2000

Abstract

A learning algorithm for principal component analysis (PCA) is developed based on least-squares minimization. Dual learning-rate parameters are adjusted adaptively so that the proposed algorithm achieves fast convergence and high accuracy in extracting all principal components. The proposed algorithm is robust against the error accumulation that arises in sequential PCA algorithms. We show that all information needed for PCA can be completely represented by the unnormalized weight vector, which is updated based only on the corresponding neuron's input-output product. The update of the normalized weight vector can be interpreted as a leaky Hebb's rule. The convergence of the proposed algorithm is briefly analyzed. We also establish the relation between Oja's rule and the least-squares learning rule. Finally, simulation results illustrate the effectiveness of the algorithm for PCA and for tracking time-varying directions of arrival.
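To make the general setting concrete, the sketch below shows a deflation-based sequential PCA learner built on Oja's leaky Hebbian rule, which the abstract relates to the least-squares learning rule. This is an illustrative assumption, not the paper's algorithm: it uses a fixed step size in place of the adaptively adjusted dual learning rates, and the function name, step size, and deflation scheme are choices made here for the example.

```python
import numpy as np

def sequential_pca_oja(X, n_components, lr=0.01, n_epochs=20, seed=0):
    """Deflation-based sequential PCA using Oja's leaky Hebbian rule.

    Illustrative sketch only: the step size is fixed, whereas the paper's
    algorithm adapts dual learning rates via least-squares recursions.
    """
    rng = np.random.default_rng(seed)
    n_samples, dim = X.shape
    W = np.zeros((n_components, dim))
    residual = X.copy()

    for k in range(n_components):
        w = rng.standard_normal(dim)
        w /= np.linalg.norm(w)
        for _ in range(n_epochs):
            for x in residual:
                y = w @ x                  # neuron output
                w += lr * y * (x - y * w)  # Oja's rule: Hebbian term with leaky decay
            w /= np.linalg.norm(w)         # keep the weight vector normalized
        W[k] = w
        # Deflate: remove the extracted component before estimating the next one
        residual = residual - np.outer(residual @ w, w)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic zero-mean data with a dominant low-dimensional structure
    X = rng.standard_normal((500, 3)) @ np.diag([5.0, 2.0, 0.5])
    X -= X.mean(axis=0)
    W = sequential_pca_oja(X, n_components=2)
    # Compare against batch PCA directions obtained from the SVD
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    print(np.abs(W @ Vt[:2].T))  # close to the identity (up to sign) if extraction succeeded
```

In this sketch the deflation step plays the role of sequential extraction of all principal components; error accumulation across deflation stages is exactly the issue the paper's robust recursive least-squares formulation is designed to mitigate.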