Convergent Projective Non-negative Matrix Factorization with Kullback-Leibler Divergence

  • Authors:
  • Lirui Hu; Liang Dai; Jianguo Wu

  • Affiliations:
  • School of Computer Science and Technology, Nantong University, Nantong 226019, China, and Key Laboratory of Intelligent Computing and Signal Processing of Ministry of Education, Anhui University, H ...; Zhongyi Information Technology Co., Ltd., Nantong 226019, China; Key Laboratory of Intelligent Computing and Signal Processing of Ministry of Education, Anhui University, Hefei 230039, China, and School of Computer Science and Technology, Anhui University, Hefei ...

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2014

Quantified Score

Hi-index 0.10

Abstract

To address the convergence problem of Projective Non-negative Matrix Factorization (P-NMF), a method called Convergent Projective Non-negative Matrix Factorization with Kullback-Leibler Divergence (CP-NMF-DIV) is proposed. CP-NMF-DIV minimizes a Kullback-Leibler divergence objective. Using the Taylor series expansion and Newton's root-finding iteration, an iterative update rule for the basis matrix is derived, together with a proof of its convergence. Experimental results show that the algorithm converges faster. Relative to Non-negative Matrix Factorization (NMF), the orthogonality and sparseness of the basis matrix are improved, although the reconstruction results show that the basis matrix is only approximately orthogonal. In face recognition, the method achieves higher recognition accuracy and remains stable in most cases when the rank of the basis matrix is set to different values. These results indicate that CP-NMF-DIV is effective.
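The setting the abstract describes is projective NMF, which seeks a non-negative basis matrix W such that X ≈ W Wᵀ X, fitted under the generalized Kullback-Leibler divergence. The paper's Newton-derived update and convergence proof are not reproduced here; the sketch below is only a minimal illustration of the P-NMF/KL objective, using a generic multiplicative gradient-split update with square-root damping (damping is needed because the reconstruction W Wᵀ X is quadratic in W). All function names and the update rule itself are this sketch's assumptions, not the paper's algorithm.

```python
import numpy as np

def kl_div(X, V, eps=1e-10):
    """Generalized KL divergence D(X || V) = sum(X*log(X/V) - X + V)."""
    return float(np.sum(X * np.log((X + eps) / (V + eps)) - X + V))

def pnmf_kl(X, r, n_iter=100, eps=1e-10, seed=0):
    """Projective NMF: fit W >= 0 so that X ~= W W^T X under KL divergence.

    Uses a heuristic multiplicative update (a stand-in for the paper's
    Newton-derived rule): the KL gradient w.r.t. W splits into a positive
    part (from sum V) and a negative part (from -sum X log V), and W is
    scaled by the square root of their ratio.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 0.1           # positive random initialization
    J = np.ones_like(X)                    # all-ones matrix, same shape as X
    for _ in range(n_iter):
        V = W @ (W.T @ X) + eps            # current reconstruction W W^T X
        R = X / V                          # elementwise ratio X / V
        # negative part of the gradient: X R^T W + R X^T W
        num = X @ (R.T @ W) + R @ (X.T @ W)
        # positive part of the gradient: X J^T W + J X^T W
        den = X @ (J.T @ W) + J @ (X.T @ W) + eps
        # sqrt damping: V depends quadratically on W, so a plain ratio
        # update overshoots and oscillates
        W *= np.sqrt(num / den)
    return W
```

At a stationary point the two gradient parts balance (num = den), so the multiplicative factor is 1 and W stops moving; non-negativity is preserved automatically because W is only ever multiplied by positive factors.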