The Sensitivity-Based Linear Learning Method (SBLLM) is a learning method for two-layer feedforward neural networks that, based on sensitivity analysis, calculates the weights by solving systems of linear equations. This yields a substantial saving in computational time, which makes the method very competitive with other batch learning algorithms. The SBLLM works in batch mode; however, several reasons justify the need for an on-line version of the algorithm. Among them are the need for real-time learning in environments where the information is not available at the outset but is acquired continually, and situations in which large databases must be managed with limited computing resources. In this paper an incremental version of the SBLLM is presented. The theoretical basis for the method is given, and its performance is illustrated by comparing the results obtained by the on-line and batch mode versions of the algorithm.
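To make the batch/on-line distinction concrete, the following sketch fits the weights of a single linear layer both ways: once by solving the full linear system at the outset, and once by absorbing one sample at a time through running normal equations. This is a minimal illustration under stated assumptions, not the authors' method: the running-sum update, the ridge term, and the names (batch_weights, IncrementalLinearLayer) are choices made here for illustration, and the actual SBLLM additionally uses sensitivity analysis to couple the two layers of the network.

```python
import numpy as np

def batch_weights(X, Y):
    """Batch mode: solve the linear system X @ W ~= Y over all data at once."""
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W

class IncrementalLinearLayer:
    """On-line mode: accumulate the normal equations sample by sample.

    Illustrative recursive least-squares scheme (an assumption, not the
    exact SBLLM update): maintain A = X^T X and b = X^T Y as running sums.
    """
    def __init__(self, n_in, n_out, ridge=1e-8):
        self.A = ridge * np.eye(n_in)     # running X^T X; ridge keeps it invertible
        self.b = np.zeros((n_in, n_out))  # running X^T Y

    def update(self, x, y):
        """Absorb one (input, target) pair without storing past samples."""
        x = x.reshape(-1, 1)
        self.A += x @ x.T
        self.b += x @ y.reshape(1, -1)

    def weights(self):
        """Current least-squares weights given all samples seen so far."""
        return np.linalg.solve(self.A, self.b)

# Toy check: the incremental solution matches the batch solution.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = X @ rng.normal(size=(5, 2)) + 0.01 * rng.normal(size=(200, 2))

layer = IncrementalLinearLayer(n_in=5, n_out=2)
for x, y in zip(X, Y):
    layer.update(x, y)

print(np.allclose(layer.weights(), batch_weights(X, Y), atol=1e-3))  # True
```

Accumulating X^T X and X^T Y lets each new sample be absorbed in O(n_in^2) time without revisiting earlier data, which is exactly the property an on-line learner needs when information arrives continually or when the full database does not fit the available computing resources.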