Dynamics of Generalized PCA and MCA Learning Algorithms

  • Authors:
  • Dezhong Peng; Zhang Yi

  • Affiliations:
  • Univ. of Electron. Sci. & Technol., Chengdu

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2007

Abstract

Principal component analysis (PCA) and minor component analysis (MCA) are two important statistical tools with many applications in signal processing and data analysis. PCA and MCA neural networks (NNs) can be used to extract the principal component and the minor component from input data online. It is therefore of interest to develop generalized learning algorithms for PCA and MCA NNs, and several novel generalized PCA and MCA learning algorithms are proposed in this paper. Convergence of PCA and MCA learning algorithms is an essential issue in practical applications. Traditionally, convergence is studied via the deterministic continuous-time (DCT) method. The DCT method requires the learning rate of the algorithms to approach zero, which is unrealistic in many practical applications. In this paper, the deterministic discrete-time (DDT) method is used to study the dynamical behaviors of the proposed algorithms. The DDT method is more suitable for this convergence analysis because it does not require the learning rate to approach zero, unlike the DCT method. It is proven that, under some mild conditions, the weight vector in the proposed algorithms converges exponentially to the principal or minor component. Simulation results further illustrate the theoretical findings.
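To make the DDT setting concrete, the sketch below iterates the classical Oja PCA rule with a small constant learning rate on a fixed sample covariance matrix; this is the averaged, deterministic discrete-time view of a PCA NN update. It is only an illustrative stand-in, not the generalized algorithms proposed in the paper, and the learning rate `eta`, the synthetic data, and the iteration count are assumptions chosen for the example.

```python
# Minimal sketch of a DDT-style PCA iteration (Oja's rule as a stand-in,
# not the paper's generalized PCA/MCA algorithms).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data with one dominant direction.
X = rng.standard_normal((5000, 3)) @ np.diag([3.0, 1.0, 0.3])
C = X.T @ X / X.shape[0]          # sample covariance matrix

eta = 0.02                        # constant learning rate (DDT setting)
w = rng.standard_normal(3)
w /= np.linalg.norm(w)

for _ in range(1000):
    # Oja's rule in DDT form: w(k+1) = w(k) + eta * (C w(k) - (w(k)^T C w(k)) w(k))
    Cw = C @ w
    w = w + eta * (Cw - (w @ Cw) * w)

# Compare the learned weight vector with the true principal eigenvector of C.
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, -1]
print("cosine similarity with principal eigenvector:",
      abs(w @ v1) / np.linalg.norm(w))
```

With a sufficiently small constant learning rate, the weight vector in this averaged iteration aligns rapidly with the principal eigenvector (cosine similarity near 1), which is the kind of exponential convergence behavior the DDT analysis in the paper establishes for its generalized algorithms.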