A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices

  • Authors:
  • Yiguang Liu, Zhisheng You, Liping Cao

  • Affiliations:
  • Yiguang Liu and Zhisheng You: Institute of Image and Graphics, School of Computer Science and Engineering, Sichuan University, Chengdu, 610064, PR China; Liping Cao: Sichuan University Library, Sichuan University, Chengdu, 610064, PR China

  • Venue:
  • Computers & Mathematics with Applications
  • Year:
  • 2007

Abstract

Efficient calculation of the eigenpairs of a matrix, especially a general real matrix, is significant in engineering, and neural networks run asynchronously and can achieve high computational performance; this paper therefore introduces a recurrent neural network (RNN) to extract an eigenpair. The RNN, whose connection weights depend on the matrix, can be transformed into a complex differential system whose variable z(t) is a complex vector. From the analytic expression for |z(t)|^2, the convergence properties of the RNN are analyzed in detail. For a generic nonzero initial complex vector, the RNN yields the largest imaginary part among all eigenvalues; by a rearrangement of the connection matrix, the largest real part is obtained instead. An experiment with a 7x7 matrix demonstrates the validity of the method, and two matrices of dimension 50 and 100, respectively, are used to test its efficiency as the dimension grows. The results indicate that the number of iterations at which the network reaches its equilibrium state is not sensitive to the dimensionality. The RNN can also be used to estimate the largest modulus of the eigenvalues. Compared with other neural networks designed for similar aims, this RNN is applicable to general real matrices.
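
The abstract does not reproduce the network equations, but the behavior it describes is consistent with linear dynamics of the form dz/dt = Wz: the norm |z(t)| then grows at a rate equal to the largest real part of the eigenvalues of W, and taking W = -iA rotates the spectrum of A so that the same growth rate equals the largest imaginary part of A's eigenvalues. The Python/NumPy sketch below illustrates that principle only; it is not the authors' RNN, and the function name growth_rate, the step size h, and the iteration counts are illustrative assumptions.

```python
import numpy as np

def growth_rate(W, steps=20000, h=1e-3, seed=0):
    """Estimate the largest real part of the eigenvalues of W from the
    asymptotic growth rate of |z(t)| under dz/dt = W z, using explicit
    Euler integration with per-step renormalization to avoid overflow."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # generic start
    z /= np.linalg.norm(z)
    logsum = 0.0
    for k in range(steps):
        z = z + h * (W @ z)              # one Euler step of dz/dt = W z
        nrm = np.linalg.norm(z)
        if k >= steps // 2:              # discard the initial transient
            logsum += np.log(nrm)
        z /= nrm
    return logsum / ((steps - steps // 2) * h)

# A random real 7x7 test matrix, echoing the paper's example size.
A = np.random.default_rng(1).standard_normal((7, 7))

# W = A: growth is governed by max Re(lambda) of A.
# W = -1j*A: the spectrum is rotated by -pi/2 in the complex plane,
# so the growth rate becomes max Im(lambda) of A instead.
print("largest real part (growth rate):", growth_rate(A))
print("largest imag part (growth rate):", growth_rate(-1j * A))
eig = np.linalg.eigvals(A)
print("reference (numpy eig):          ", eig.real.max(), eig.imag.max())
```

The per-step renormalization keeps |z| bounded while the accumulated logarithm still recovers the growth rate, mirroring the abstract's analysis based on |z(t)|^2; the switch between A and -iA plays the role of the "rearrangement of the connection matrix" mentioned above.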