A linear learning method for multilayer perceptrons using least-squares
IDEAL'07: Proceedings of the 8th International Conference on Intelligent Data Engineering and Automated Learning
This paper presents two algorithms to aid the supervised learning of feedforward neural networks: an initialization algorithm and a learning algorithm. Both proposed methods are based on independently optimizing a subnetwork using linear least squares. An advantage of this approach is that it reduces the dimensionality of the effective search space for the non-linear algorithm, which in turn decreases the number of training epochs required to find a good solution. The performance of the proposed methods is illustrated with simulated examples.
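The core idea of solving one subnetwork by linear least squares can be sketched as follows. This is a hypothetical, minimal illustration (not the paper's exact algorithm): the hidden layer of a one-hidden-layer network is held fixed, and only the linear output layer is fit in closed form, so the non-linear search would only need to handle the hidden-layer weights. The data, network size, and variable names are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: x in [-1, 1], target y = sin(pi * x).
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X)

# Hidden layer with tanh units; its weights are held fixed here, so the
# output layer forms a linear subnetwork that can be optimized exactly.
n_hidden = 20
W1 = rng.normal(size=(1, n_hidden))
b1 = rng.normal(size=n_hidden)
H = np.tanh(X @ W1 + b1)            # hidden activations, shape (200, n_hidden)

# Output layer: solve min_W || [H, 1] W - y ||^2 by linear least squares.
H_aug = np.hstack([H, np.ones((H.shape[0], 1))])   # append bias column
W2, *_ = np.linalg.lstsq(H_aug, y, rcond=None)

mse = float(np.mean((H_aug @ W2 - y) ** 2))
print(f"training MSE after least-squares output fit: {mse:.6f}")
```

Because the output weights are obtained in a single closed-form solve rather than by gradient steps, the iterative (non-linear) part of training only has to search over the hidden-layer parameters, which is the dimensionality reduction the abstract refers to.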