Preintegration lateral inhibition enhances unsupervised learning
Neural Computation
Non-negative matrix factorization with α-divergence
Pattern Recognition Letters
On α-divergence based nonnegative matrix factorization for clustering cancer gene expression data
Artificial Intelligence in Medicine
Blind Image Separation Using Nonnegative Matrix Factorization with Gibbs Smoothing
Neural Information Processing
Nonnegative Tensor Factorization with Smoothness Constraints
ICIC '08 Proceedings of the 4th international conference on Intelligent Computing: Advanced Intelligent Computing Theories and Applications - with Aspects of Theoretical and Methodological Issues
Unsupervised learning of overlapping image components using divisive input modulation
Computational Intelligence and Neuroscience
Weighted Nonnegative Matrix Co-Tri-Factorization for Collaborative Prediction
ACML '09 Proceedings of the 1st Asian Conference on Machine Learning: Advances in Machine Learning
Adaptive harmonic spectral decomposition for multiple pitch estimation
IEEE Transactions on Audio, Speech, and Language Processing
IEEE Transactions on Neural Networks
Algorithms for nonnegative matrix factorization with the β-divergence
Neural Computation
Multistability of α-divergence based NMF algorithms
Computers & Mathematics with Applications
Non-negative residual matrix factorization: problem definition, fast solutions, and applications
Statistical Analysis and Data Mining
Selecting β-divergence for nonnegative matrix factorization by score matching
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part II
This letter presents a general parametric divergence measure that includes the quadratic error and the Kullback-Leibler divergence as special cases. A parametric generalization of the two multiplicative update rules for nonnegative matrix factorization introduced by Lee and Seung (2001) is shown to converge to locally optimal solutions of the nonnegative matrix factorization problem under this new cost function. Numerical simulations demonstrate that the new update rule can improve convergence speed relative to the quadratic-distance rule. A proof of convergence is given that, as in Lee and Seung, relies on an auxiliary function of the kind used in the expectation-maximization framework.
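The abstract describes multiplicative updates for a parametric divergence interpolating between the quadratic error and Kullback-Leibler divergence. A minimal NumPy sketch of such a rule is given below, written in terms of the β-divergence family (β = 2 gives the quadratic cost, β = 1 gives KL); this is an illustrative implementation under that parameterization, not the letter's reference code, and the function name and defaults are invented for the example.

```python
import numpy as np

def nmf_beta(V, rank, beta=1.5, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF minimizing the beta-divergence.

    beta = 2 -> quadratic (Euclidean) cost; beta = 1 -> Kullback-Leibler.
    Intermediate values interpolate between the two special cases, in the
    spirit of the parametric divergence discussed in the letter.
    (Illustrative sketch only.)
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps   # nonnegative random init
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Update H with W fixed; eps guards against division by zero.
        WH = W @ H + eps
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)
        # Update W with H fixed.
        WH = W @ H + eps
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H
```

Because both updates are purely multiplicative, W and H stay nonnegative throughout, and each step does not increase the chosen divergence, matching the auxiliary-function convergence argument mentioned in the abstract.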