This paper deals with learning by natural-gradient optimization on noncompact manifolds. On a Riemannian manifold, computing quantities such as closed-form expressions of geodesic curves may be infeasible when the manifold is noncompact. For this reason, it is of interest to study learning by optimization over noncompact manifolds endowed with pseudo-Riemannian metrics, which may give rise to tractable calculations. A general theory of natural-gradient-based learning on noncompact manifolds is developed, along with specific learning cases of interest.
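To make the underlying update rule concrete, the following is a minimal sketch of natural-gradient descent, in which the Euclidean gradient is preconditioned by the inverse of a metric tensor. The loss, the fixed metric `G`, the step size, and the iteration count here are illustrative assumptions, not taken from the paper; the paper's setting (pseudo-Riemannian metrics on noncompact manifolds) is considerably more general than this flat, constant-metric toy case.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): natural-gradient
# descent replaces the Euclidean step with a metric-preconditioned one,
#     w <- w - eta * G(w)^{-1} grad L(w).
# Here we minimize L(w) = 0.5 * w^T A w on R^2 with a fixed,
# hypothetical metric G, purely to illustrate the update rule.

A = np.array([[3.0, 0.0], [0.0, 1.0]])   # curvature of the toy loss
G = np.array([[2.0, 0.0], [0.0, 0.5]])   # assumed (constant) metric tensor

def loss(w):
    return 0.5 * w @ A @ w

def grad(w):
    return A @ w                          # Euclidean gradient of the loss

def natural_gradient_descent(w0, eta=0.1, steps=200):
    w = np.array(w0, dtype=float)
    G_inv = np.linalg.inv(G)              # constant metric: invert once
    for _ in range(steps):
        w = w - eta * G_inv @ grad(w)     # natural-gradient step
    return w

w_star = natural_gradient_descent([1.0, -2.0])
print(loss(w_star))                       # driven close to 0
```

In this constant-metric toy the preconditioner merely rescales the step per coordinate; on a curved (pseudo-)Riemannian manifold the metric varies with the point, and the update must be combined with geodesics or retractions, which is exactly where the tractability concerns raised in the abstract arise.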