In this paper, we propose an optimization method for neural networks based on the geometrical structure of the neuromanifold. The optimization process starts from the manifold of a sufficiently large network model. Within the manifold of the given network structure, we first find an optimal point that achieves good generalization performance; to do this, we propose an extension of adaptive natural gradient learning with a regularization term. Exploiting the hierarchical structure of the neuromanifold, we then optimize the network structure itself by applying the natural pruning method, starting from the current optimal parameter point. The whole optimization process can be interpreted from a geometrical point of view. We confirm the generalization performance of networks optimized by the proposed method through experiments on benchmark data sets.
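The two ingredients named above, natural gradient learning with a regularization term followed by Fisher-based pruning, can be illustrated on a toy model. The sketch below is not the authors' implementation: it uses a plain logistic regression, an empirical Fisher matrix, L2 regularization, and a diagonal-Fisher saliency as a stand-in for natural pruning; the learning rate, damping, and regularization strength are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data from a linear teacher.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, lam=1e-2):
    # Negative log-likelihood plus an L2 regularization term.
    p = sigmoid(X @ w)
    eps = 1e-12
    nll = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return nll + 0.5 * lam * w @ w

def grad(w, lam=1e-2):
    p = sigmoid(X @ w)
    return X.T @ (p - y) / n + lam * w

def fisher(w):
    # Empirical Fisher: mean outer product of per-sample score vectors.
    p = sigmoid(X @ w)
    G = X * (p - y)[:, None]
    return G.T @ G / n

w = np.zeros(d)
lr, damping = 0.5, 1e-3
for _ in range(50):
    F = fisher(w) + damping * np.eye(d)
    # Natural gradient step: precondition the gradient by the inverse Fisher.
    w -= lr * np.linalg.solve(F, grad(w))

# Fisher-weighted saliency: small values mark weights whose removal
# perturbs the loss least -- a simple stand-in for natural pruning.
saliency = 0.5 * w**2 * np.diag(fisher(w))
print("final loss:", loss(w))
print("prune candidates:", np.argsort(saliency)[:2])
```

Preconditioning by the (damped) Fisher makes the step invariant to smooth reparameterizations of the model, which is the geometrical motivation for the natural gradient; the saliency ranking then orders parameters by their estimated contribution to the loss.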