We attempt to quantify the significance of increasing the number of neurons in the hidden layer of a feedforward neural network using the singular value decomposition (SVD). In doing so, we extend well-known properties of the SVD to evaluate the generalization ability of single-hidden-layer feedforward networks (SLFNs) with respect to the number of hidden neurons. The generalization capability of an SLFN is measured by the degree of linear independence of the patterns in hidden-layer space, which can be quantified indirectly from the singular values obtained from the SVD in a post-learning step.
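A minimal sketch of this post-learning step, assuming a tanh hidden layer and random data purely for illustration (the network sizes, weights, and rank tolerance below are assumptions, not taken from the paper): form the hidden-layer output matrix over the training patterns, take its singular values, and count those above a tolerance to estimate the numerical rank.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative SLFN dimensions: n_samples patterns, n_inputs features,
# n_hidden hidden neurons (all values are assumptions for this sketch).
n_samples, n_inputs, n_hidden = 200, 10, 30
X = rng.standard_normal((n_samples, n_inputs))
W = rng.standard_normal((n_inputs, n_hidden))  # input-to-hidden weights
b = rng.standard_normal(n_hidden)              # hidden biases

# Hidden-layer output matrix: one row per training pattern.
H = np.tanh(X @ W + b)

# The singular values of H quantify the linear independence of the
# patterns in hidden-layer space: the numerical rank is the number of
# singular values above a tolerance relative to the largest one.
s = np.linalg.svd(H, compute_uv=False)
tol = max(H.shape) * np.finfo(H.dtype).eps * s[0]
effective_rank = int(np.sum(s > tol))

print(f"hidden neurons: {n_hidden}, effective rank: {effective_rank}")
```

If the effective rank stays well below the number of hidden neurons as neurons are added, the extra neurons contribute little new (linearly independent) information about the patterns.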