Projective Nonnegative Matrix Factorization with α-Divergence

  • Authors:
  • Zhirong Yang; Erkki Oja

  • Affiliations:
  • Department of Information and Computer Science, Helsinki University of Technology, FI-02015 Espoo, Finland (both authors)

  • Venue:
  • ICANN '09: Proceedings of the 19th International Conference on Artificial Neural Networks, Part I
  • Year:
  • 2009

Abstract

A new matrix factorization algorithm that combines two recently proposed nonnegative learning techniques is presented. Our new algorithm, α-PNMF, inherits the advantage of Projective Nonnegative Matrix Factorization (PNMF) in learning a highly orthogonal factor matrix, while generalizing the Kullback-Leibler (KL) divergence to the α-divergence gives the method more flexibility in the approximation. We provide multiplicative update rules for α-PNMF together with a convergence proof. The resulting algorithm is empirically verified to give good solutions on a variety of real-world datasets. For feature extraction, α-PNMF learns highly sparse and localized part-based representations of facial images. For clustering, the new method also outperforms Nonnegative Matrix Factorization with α-divergence and ordinary PNMF, achieving higher purity and lower entropy.
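
Since the abstract gives no formulas, the NumPy sketch below is an illustration rather than the authors' algorithm: it implements the standard Amari α-divergence used as the approximation criterion (with the KL divergence recovered in the limit α → 1) and, for reference only, one plain Euclidean PNMF multiplicative update for the projective model X ≈ W WᵀX. The α-PNMF update rules and their convergence proof are given in the paper and are not reproduced here; all function names in the sketch are hypothetical.

```python
import numpy as np

def alpha_divergence(X, Xhat, alpha, eps=1e-12):
    """Amari alpha-divergence D_alpha(X || Xhat) between nonnegative matrices.

    As alpha -> 1 it approaches KL(X || Xhat); as alpha -> 0, KL(Xhat || X).
    """
    X = np.maximum(X, eps)
    Xhat = np.maximum(Xhat, eps)
    if np.isclose(alpha, 1.0):   # KL(X || Xhat)
        return np.sum(X * np.log(X / Xhat) - X + Xhat)
    if np.isclose(alpha, 0.0):   # KL(Xhat || X)
        return np.sum(Xhat * np.log(Xhat / X) - Xhat + X)
    return np.sum(X**alpha * Xhat**(1.0 - alpha)
                  - alpha * X + (alpha - 1.0) * Xhat) / (alpha * (alpha - 1.0))

def pnmf_euclidean_update(W, X):
    """One multiplicative update of plain squared-error PNMF for X ~ W W^T X.

    Shown only for context; this is NOT the alpha-PNMF update derived
    in the paper.
    """
    XXt = X @ X.T
    numer = 2.0 * XXt @ W
    denom = W @ (W.T @ XXt @ W) + XXt @ (W @ (W.T @ W))
    return W * numer / np.maximum(denom, 1e-12)

# Tiny usage example on random nonnegative data.
rng = np.random.default_rng(0)
X = rng.random((30, 100))   # 30 features, 100 samples
W = rng.random((30, 5))     # rank-5 projective factor
for _ in range(200):
    W = pnmf_euclidean_update(W, X)
print("alpha=0.5 divergence of reconstruction:",
      alpha_divergence(X, W @ W.T @ X, alpha=0.5))
```

Because the projective model reconstructs the data as W WᵀX with a single nonnegative factor W, near-orthogonal columns of W emerge naturally, which is the property α-PNMF inherits from ordinary PNMF according to the abstract.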