Learning independent components on the orthogonal group of matrices by retractions
Neural Processing Letters
The aim of this contribution is to present a tutorial on learning algorithms for a single neural layer whose connection matrix belongs to the orthogonal group. The algorithms exploit geodesics, suitably connected into piecewise approximations of the integral of the exact differential learning equation. The learning equations considered arise essentially from Riemannian-gradient-based optimization theory, with both deterministic and diffusion-type gradients. The paper specifically aims at reviewing the relevant mathematics (and at presenting it as transparently as possible, so as to make it accessible to readers without a background in differential geometry), at bringing together modern optimization methods on manifolds, and at comparing the different algorithms on a common machine learning problem. As a numerical case study, we consider an application to non-negative independent component analysis, although it should be recognized that Riemannian gradient methods give rise to general-purpose algorithms, by no means limited to ICA-related applications.
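The geodesic-based Riemannian gradient update mentioned in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, assuming NumPy/SciPy: the Euclidean gradient is projected to a skew-symmetric matrix (the Riemannian gradient on the orthogonal group under the canonical metric, up to a constant factor), and the step moves along the geodesic via a matrix exponential. The toy criterion, step size, and variable names are illustrative assumptions, not the paper's actual learning criterion.

```python
import numpy as np
from scipy.linalg import expm

def geodesic_step(W, egrad, eta):
    """One geodesic-based Riemannian gradient step on the orthogonal group.

    G = egrad W^T - W egrad^T is skew-symmetric, so expm(-eta * G) is
    orthogonal and the update W <- expm(-eta * G) W stays on the group.
    """
    G = egrad @ W.T - W @ egrad.T
    return expm(-eta * G) @ W

# Toy criterion (illustrative): align W with a target special orthogonal
# matrix A by minimizing f(W) = -trace(A^T W); its Euclidean gradient is -A.
rng = np.random.default_rng(0)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(Q) < 0:
    Q[:, 0] = -Q[:, 0]  # flip a column so the target lies in SO(n)
A = Q

W = np.eye(n)
for _ in range(200):
    W = geodesic_step(W, -A, eta=0.1)

# Unlike an additive Euclidean update, W remains orthogonal (up to
# round-off) at every iteration, with no re-orthogonalization needed.
```

Because the update multiplies by an exact orthogonal factor, orthogonality of the connection matrix is preserved by construction; this is the practical advantage of geodesic (and, more generally, retraction-based) steps over naive Euclidean gradient descent followed by re-orthogonalization.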