We present a method for constructing multiplicative update rules for non-negative projections based on Oja's iterative learning rule. Our method integrates the multiplicative normalization factor into the original additive update rule as an additional term, which generally points in a roughly opposite direction. As a consequence, the modified additive learning rule can easily be converted into its multiplicative counterpart, which maintains non-negativity after each iteration. The derivation provides a sound interpretation of learning non-negative projection matrices by iterative multiplicative updates: a kind of Hebbian learning with normalization. A convergence analysis is sketched by interpreting the multiplicative updates as a special case of natural gradient learning. We also demonstrate two applications of the proposed technique, a non-negative variant of linear Hebbian networks and a non-negative Fisher discriminant analysis, including its kernel extension. In experiments on facial images, the resulting algorithms exhibit interesting properties for data analysis tasks.
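The conversion described above can be illustrated with a minimal sketch. For Oja's rule, the additive update ΔW = η(CW − W WᵀCW) splits naturally into a positive Hebbian term and a negative normalization term; dividing one by the other elementwise yields a multiplicative step that keeps every entry of W non-negative. This toy implementation assumes a non-negative data correlation matrix C and a positive random initialization; the loop structure, step count, and epsilon guard are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-negative data: 200 samples in 5 dimensions
X = np.abs(rng.normal(size=(200, 5)))
C = X.T @ X / len(X)              # correlation matrix, entrywise non-negative

# Non-negative projection matrix W (5 x 2), initialized strictly positive
W = np.abs(rng.normal(size=(5, 2))) + 0.1

for _ in range(500):
    pos = C @ W                   # Hebbian (positive) part of the Oja gradient
    neg = W @ (W.T @ C @ W)       # normalization (negative) part
    # Multiplicative step: ratio of the two parts replaces the additive
    # difference, so W remains elementwise non-negative at every iteration.
    W *= pos / (neg + 1e-12)
```

At a fixed point the ratio pos/neg is 1 elementwise, i.e. the additive gradient CW − W WᵀCW vanishes on the support of W, which is why the multiplicative and additive rules share stationary points while only the former preserves the non-negativity constraint for free.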