Software for estimating sparse Jacobian matrices. ACM Transactions on Mathematical Software (TOMS).
Ill-conditioning in neural network training problems. SIAM Journal on Scientific Computing.
Rank-Deficient and Discrete Ill-Posed Problems: Numerical Aspects of Linear Inversion.
Neural Networks for Pattern Recognition.
Fitting Nature's Basic Functions Part III: Exponentials, Sinusoids, and Nonlinear Least Squares. Computing in Science and Engineering.
Large scale least squares scattered data fitting. Applied Numerical Mathematics.
Radial Basis Functions.
A generalized learning paradigm exploiting the structure of feedforward neural networks. IEEE Transactions on Neural Networks.
Mathematics and Computers in Simulation.
The training of some types of neural networks leads to separable nonlinear least squares problems. These problems may be ill-conditioned and require special techniques. A robust algorithm based on the variable projection method of Golub and Pereyra is designed for a class of feed-forward neural networks and tested on benchmark examples and real data.
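The separable structure the abstract refers to can be illustrated with a minimal sketch of the variable projection idea of Golub and Pereyra, here on a toy sum-of-exponentials model rather than the paper's neural-network algorithm: because the model is linear in some parameters, those are eliminated by a linear least squares solve inside the residual, and the outer optimization runs only over the nonlinear parameters. The model, data, and optimizer choice below are illustrative assumptions, not taken from the paper.

```python
# Variable projection (VarPro) sketch for a separable nonlinear least
# squares problem: y ~ Phi(alpha) @ c, linear in c, nonlinear in alpha.
# Toy example: y = c1*exp(-a1*x) + c2*exp(-a2*x) (illustrative only).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 100)
true_alpha = np.array([1.0, 3.0])   # nonlinear parameters (decay rates)
true_c = np.array([2.0, -1.0])      # linear parameters (amplitudes)

def basis(alpha):
    # Column j is phi_j(x) = exp(-alpha_j * x).
    return np.exp(-np.outer(x, alpha))

y = basis(true_alpha) @ true_c + 0.01 * rng.standard_normal(x.size)

def projected_residual(alpha):
    # Eliminate the linear parameters: for fixed alpha, the optimal c
    # solves a linear least squares problem, leaving the "projected"
    # functional in alpha alone.
    Phi = basis(alpha)
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    r = y - Phi @ c
    return 0.5 * (r @ r)

# Optimize only over the nonlinear parameters.
res = minimize(projected_residual, x0=np.array([0.5, 2.0]),
               method="Nelder-Mead")
alpha_hat = np.sort(res.x)
c_hat, *_ = np.linalg.lstsq(basis(res.x), y, rcond=None)
```

Compared with optimizing over all four parameters at once, the reduced problem has fewer unknowns and, as Golub and Pereyra showed, often better conditioning, which is the motivation for applying it to network training.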