The Sensitivity-Based Linear Learning Method (SBLLM) is a learning algorithm for two-layer feedforward neural networks that uses sensitivity analysis to obtain the weights by solving a system of linear equations. As a result, it achieves a substantial saving in computational time and is considerably faster than other learning algorithms. This paper presents a generalization of the SBLLM that adds a regularization term to the cost function, with the regularization parameter estimated by an automatic technique. The theoretical basis of the method is given, and its performance is illustrated by comparing the results obtained with the automatic technique against those obtained by setting the regularization parameter manually through cross-validation.
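To make the core idea concrete, the sketch below shows how a regularized cost for a single linear layer reduces to a system of linear equations, which is the kind of per-layer subproblem the SBLLM solves. This is a minimal illustration, not the authors' implementation: the function name, the bias handling, and the toy data are assumptions introduced here, and the actual SBLLM alternates such updates over both layers using sensitivity information.

```python
import numpy as np

def regularized_linear_weights(X, Y, lam):
    """Solve a regularized linear least-squares problem for one layer.

    Minimizes ||X W - Y||^2 + lam * ||W||^2 via the normal equations
    (X^T X + lam * I) W = X^T Y, i.e. a plain linear system rather than
    an iterative gradient-based optimization (illustrative sketch only).
    """
    n_inputs = X.shape[1]
    A = X.T @ X + lam * np.eye(n_inputs)
    b = X.T @ Y
    return np.linalg.solve(A, b)

# Toy usage: inputs augmented with a bias column, random targets.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(100, 5)), np.ones((100, 1))])
Y = rng.normal(size=(100, 2))
W = regularized_linear_weights(X, Y, lam=0.1)
print(W.shape)  # (6, 2): one weight vector per output unit
```

Choosing lam is the step the paper automates; in the manual alternative it would be selected by cross-validation over a grid of candidate values.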