Information geometry emerged from studies on invariant properties of a manifold of probability distributions. It includes convex analysis and its duality as a special but important part. Here, we begin with a convex function and construct a dually flat manifold. The manifold possesses a Riemannian metric, two types of geodesics, and a divergence function. The generalized Pythagorean theorem and the dual projections theorem are derived therefrom. We then construct the α-geometry by extending this convex analysis. In this review, the geometry of a manifold of probability distributions is then presented, and a number of applications are touched upon. The Appendix presents an easily understandable introduction to differential geometry and its duality.
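
As a minimal sketch of the standard constructions the abstract alludes to (the notation below is assumed for illustration, not drawn from the paper itself): given a strictly convex function \(\psi(\theta)\), the dual coordinates, the Riemannian metric, and the canonical (Bregman-type) divergence arise through the Legendre transform:
\[
\eta_i = \frac{\partial \psi(\theta)}{\partial \theta^i}, \qquad
\varphi(\eta) = \max_{\theta}\bigl\{\theta \cdot \eta - \psi(\theta)\bigr\},
\]
\[
g_{ij}(\theta) = \frac{\partial^2 \psi(\theta)}{\partial \theta^i \, \partial \theta^j},
\]
\[
D(P \,\|\, Q) = \psi(\theta_P) + \varphi(\eta_Q) - \theta_P \cdot \eta_Q .
\]
In this setting the generalized Pythagorean theorem states that \(D(P\|R) = D(P\|Q) + D(Q\|R)\) whenever the dual geodesic joining \(P\) and \(Q\) meets the primal geodesic joining \(Q\) and \(R\) orthogonally with respect to \(g\).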