Since the perceptron was introduced as a model for learning to classify input patterns, simple and multilayer perceptrons have been studied extensively. Despite wide-ranging work in both theory and applications, multilayer perceptrons still suffer from unresolved problems such as slow learning and overfitting. Finding a thorough solution to these problems requires consolidating previous studies and identifying new directions that enhance the practical power of multilayer perceptrons. As a first step toward this new stage of research on multilayer perceptrons, we give short reviews of two interesting and important approaches: the stochastic approach and the geometric approach. We also explain an efficient learning algorithm that grew out of these statistical and geometrical studies and is now well known as the natural gradient learning method.
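The natural gradient method preconditions the ordinary gradient g by the inverse Fisher information matrix, updating the parameters as θ ← θ − η F⁻¹g, so that steps respect the Riemannian geometry of the parameter space. The following is a minimal sketch, not the exact algorithm of the reviewed papers: it uses an empirical Fisher estimate built from per-sample gradients, a damping term for numerical stability, and a toy quadratic loss, all of which are illustrative assumptions.

```python
import numpy as np

def natural_gradient_step(theta, grad_fn, samples, lr=0.5, damping=0.5):
    """One natural-gradient update: theta <- theta - lr * F^{-1} g.

    F is approximated by the empirical Fisher matrix (the average outer
    product of per-sample gradients), damped for numerical stability.
    This is an illustrative sketch, not the authors' exact algorithm.
    """
    grads = np.array([grad_fn(theta, x) for x in samples])   # shape (N, d)
    g = grads.mean(axis=0)                                   # mean gradient
    F = grads.T @ grads / len(samples) + damping * np.eye(theta.size)
    return theta - lr * np.linalg.solve(F, g)                # theta - lr * F^{-1} g

# Toy problem: per-sample loss 0.5 * ||theta - x||^2, whose gradient is theta - x.
# The minimizer of the average loss is the sample mean.
rng = np.random.default_rng(0)
data = rng.normal(loc=[1.0, -2.0], scale=0.1, size=(100, 2))
theta = np.zeros(2)
for _ in range(200):
    theta = natural_gradient_step(theta, lambda t, x: t - x, data)
print(theta)  # converges toward the sample mean, close to [1, -2]
```

On this toy problem the damped empirical Fisher is close to a scaled identity near the optimum, so the natural gradient behaves like a well-conditioned gradient step; the benefit of the method shows up in genuinely ill-conditioned models such as multilayer perceptrons, where F is far from isotropic.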