Radial Basis Functions
Nonnegative Matrix Factorization (NMF) is a feature extraction technique that has found numerous applications in machine learning and data processing. In some applications, the feature vectors (lateral components) can be modeled as linear combinations of basis functions. In this paper, we model the features with Gaussian Radial Basis Functions (GRBFs), which have become popular for high-dimensional data approximation and multivariate interpolation problems. To estimate the coefficients of a linear combination of GRBFs, one of the NMF subproblems is reformulated as a Quadratic Programming (QP) problem subject to inequality constraints, which is then solved with the active-set method. Experiments carried out on spectral datasets demonstrate that our approach outperforms several well-known NMF algorithms in terms of Signal-to-Interference Ratio (SIR).
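The coefficient-estimation step described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a one-dimensional signal, a hypothetical `grbf_design_matrix` helper, and uses SciPy's `scipy.optimize.nnls` (the Lawson–Hanson active-set solver) as a stand-in for the QP-with-inequality-constraints step, with the nonnegativity of the coefficients as the inequality constraints.

```python
import numpy as np
from scipy.optimize import nnls

def grbf_design_matrix(x, centers, width):
    """Hypothetical helper: column j is the Gaussian RBF
    exp(-(x - c_j)^2 / (2 * width^2)) evaluated on the grid x."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

# Evaluation grid and GRBF centers (assumed setup, not from the paper).
x = np.linspace(0.0, 1.0, 200)
centers = np.linspace(0.0, 1.0, 21)          # spacing 0.05, includes 0.30 and 0.70
Phi = grbf_design_matrix(x, centers, width=0.05)

# Synthetic nonnegative "feature" built from two of the GRBFs themselves,
# so an exact nonnegative solution exists.
target = 1.5 * np.exp(-((x - 0.30) ** 2) / (2 * 0.05 ** 2)) \
       + 0.8 * np.exp(-((x - 0.70) ** 2) / (2 * 0.05 ** 2))

# Active-set NNLS: min ||Phi @ c - target||_2  subject to  c >= 0.
coeffs, residual = nnls(Phi, target)
approx = Phi @ coeffs
```

In the NMF setting, this solve would be repeated for each feature vector, with the GRBF expansion constraining the factor to a smooth nonnegative curve rather than an arbitrary nonnegative vector.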