ISNN '09 Proceedings of the 6th International Symposium on Neural Networks on Advances in Neural Networks
Determining an appropriate neural-network (NN) structure is an important issue for a given learning or training task, since NN performance depends heavily on it. To remedy the weaknesses of conventional back-propagation (BP) neural networks and their learning algorithms, a new Laguerre orthogonal basis neural network is constructed. Based on this special structure, a weights-direct-determination method is derived, which obtains the optimal weights of such a neural network directly (i.e., in a single step). Furthermore, a growing algorithm is presented for determining the minimal number of hidden-layer neurons. Theoretical analysis and simulation results substantiate the efficacy of the Laguerre-orthogonal-basis neural network and its growing algorithm based on the weights-direct-determination method.
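The two ideas in the abstract can be sketched concretely. Because the hidden-layer activations are fixed Laguerre polynomials, the output weights solve a linear least-squares problem, so they can be computed in one step via the pseudoinverse rather than by iterative BP training; the growing algorithm then simply adds hidden neurons until the training error is small enough. The sketch below is a minimal illustration under these assumptions, not the authors' exact formulation; the function names and the error tolerance are hypothetical.

```python
import numpy as np

def laguerre_basis(x, n):
    """Evaluate Laguerre polynomials L_0..L_{n-1} at the points x,
    using the three-term recurrence
    (k+1) L_{k+1}(x) = (2k+1-x) L_k(x) - k L_{k-1}(x)."""
    P = np.zeros((len(x), n))
    P[:, 0] = 1.0
    if n > 1:
        P[:, 1] = 1.0 - x
    for k in range(1, n - 1):
        P[:, k + 1] = ((2 * k + 1 - x) * P[:, k] - k * P[:, k - 1]) / (k + 1)
    return P

def direct_weights(x, y, n):
    """Weights-direct-determination: the hidden-to-output weights are the
    least-squares solution of A w = y, obtained in one step (no iteration)."""
    A = laguerre_basis(x, n)
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def grow_network(x, y, tol=1e-6, max_neurons=30):
    """Growing algorithm (sketch): add hidden neurons one at a time and
    redetermine the weights directly, stopping once the mean-squared
    training error drops below tol."""
    for n in range(1, max_neurons + 1):
        w = direct_weights(x, y, n)
        err = np.mean((laguerre_basis(x, n) @ w - y) ** 2)
        if err < tol:
            break
    return n, w, err

# Example: approximate a smooth target function on [0, 4].
x = np.linspace(0.0, 4.0, 200)
y = x * np.exp(-x)
n, w, err = grow_network(x, y)
```

Because each candidate network is fitted by a single linear solve, the growing loop is cheap compared with retraining a BP network from scratch at every size.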