Nonnegative Matrix Factorization (NMF) based on the family of β-divergences has been shown to be advantageous in several signal processing and data analysis tasks. However, how to automatically select the best divergence in the family for given data remains an open problem. Here we propose a new estimation criterion for selecting β. Our method inserts the point estimates of the factorizing matrices from β-NMF into the Tweedie distribution that underlies the β-divergence. We then adopt a recent estimation method called Score Matching for β selection, which avoids the difficulty of computing the normalizing constant of the Tweedie distribution. Our method is tested on both synthetic and real-world data. Experimental results indicate that our selection criterion estimates β accurately, in agreement with the ground truth or with established research findings.
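To make the setting concrete, the following is a minimal NumPy sketch (not the authors' code) of the β-divergence and the standard multiplicative updates for β-NMF; the function names and iteration counts are illustrative assumptions, and the β-selection criterion itself is not implemented here.

```python
import numpy as np

def beta_divergence(V, Vhat, beta):
    """Sum of elementwise beta-divergences d_beta(V | Vhat).

    beta=1 gives generalized Kullback-Leibler, beta=0 gives
    Itakura-Saito, beta=2 gives (half) squared Euclidean distance.
    """
    if beta == 1:
        return np.sum(V * np.log(V / Vhat) - V + Vhat)
    if beta == 0:
        ratio = V / Vhat
        return np.sum(ratio - np.log(ratio) - 1)
    return np.sum((V**beta + (beta - 1) * Vhat**beta
                   - beta * V * Vhat**(beta - 1)) / (beta * (beta - 1)))

def beta_nmf(V, rank, beta=1.0, n_iter=200, seed=0):
    """beta-NMF via the classical multiplicative updates (a sketch)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Small positive random initialization keeps all entries nonnegative.
    W = rng.random((m, rank)) + 1e-3
    H = rng.random((rank, n)) + 1e-3
    for _ in range(n_iter):
        WH = W @ H
        H *= (W.T @ (V * WH**(beta - 2))) / (W.T @ WH**(beta - 1))
        WH = W @ H
        W *= ((V * WH**(beta - 2)) @ H.T) / (WH**(beta - 1) @ H.T)
    return W, H
```

The multiplicative form guarantees that W and H stay nonnegative if initialized positive; in practice one would run this for several values of β and then apply the proposed score-matching criterion to pick among them.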