Nonnegative Matrix Factorization on Orthogonal Subspace
Pattern Recognition Letters
To address the convergence problem of Projective Non-negative Matrix Factorization (P-NMF), a method called Convergent Projective Non-negative Matrix Factorization with Kullback-Leibler Divergence (CP-NMF-DIV) is proposed. CP-NMF-DIV minimizes an objective function based on the Kullback-Leibler divergence. Using a Taylor series expansion and the Newton iteration formula for root finding, an iterative update rule for the basis matrix is derived, and a proof of the algorithm's convergence is provided. Experimental results show that the algorithm converges faster than P-NMF; compared with Non-negative Matrix Factorization (NMF), the basis matrix is more orthogonal and sparser, although reconstruction of the data shows that it is still only approximately orthogonal. In face recognition, CP-NMF-DIV achieves higher recognition accuracy, and the accuracy remains stable when the rank of the basis matrix is set to different values. These results indicate that CP-NMF-DIV is effective.
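For context, the sketch below shows the baseline against which the paper compares: NMF with a Kullback-Leibler divergence objective, optimized by the standard multiplicative updates of Lee and Seung. This is an assumption-laden illustration of the objective the abstract names, not the paper's Newton-based CP-NMF-DIV update rule, whose exact form is not given in the abstract; all function names here are hypothetical.

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Baseline KL-divergence NMF via multiplicative updates
    (Lee & Seung), factorizing V ~= W @ H with V, W, H >= 0.
    NOTE: this is NOT the paper's Newton-based CP-NMF-DIV algorithm,
    only the standard objective it builds on."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        WH = W @ H + eps
        # Update H: H <- H * (W^T (V / WH)) / (W^T 1)
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
        WH = W @ H + eps
        # Update W: W <- W * ((V / WH) H^T) / (1 H^T)
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)
    return W, H

def kl_div(V, W, H, eps=1e-9):
    """Generalized Kullback-Leibler divergence D(V || WH),
    the type of objective considered in the paper."""
    WH = W @ H + eps
    return float(np.sum(V * np.log((V + eps) / WH) - V + WH))
```

These multiplicative updates are known to drive the KL divergence monotonically downward; the paper's contribution is a provably convergent update for the projective variant, where the data matrix is approximated as `W @ W.T @ V` rather than `W @ H`.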