Conventional back-propagation (BP) neural networks have inherent weaknesses such as slow convergence and the existence of local minima. Based on polynomial interpolation and approximation theory, this paper constructs a special type of feedforward neural network whose hidden-layer neurons are activated by Bernoulli polynomials. Different from conventional BP and other gradient-based training algorithms, a weights-direct-determination (WDD) method is proposed for this Bernoulli neural network (BNN), which determines the neural-network weights directly (in one general step), without a lengthy iterative BP training procedure. Moreover, by analyzing the relationship between BNN performance and the number of hidden-layer neurons, a structure-automatic-determination (SAD) algorithm is further proposed, which obtains the optimal number of hidden-layer neurons in the sense of achieving the highest learning accuracy for a specific data problem or target function/system. Computer simulations further substantiate the efficacy of the Bernoulli neural network and its deterministic algorithms.
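
The abstract describes WDD and SAD only at a high level. Below is a minimal NumPy sketch, assuming (as is typical for weights-direct-determination networks) that the hidden layer evaluates the Bernoulli polynomials B_0(x), ..., B_{n-1}(x) on a scalar input and that the output weights come from a single least-squares (pseudoinverse-type) solve; a simple growing loop stands in for the SAD criterion by keeping the hidden-layer size with the smallest training error. The function names, the cap max_hidden, and the toy target sin(pi*x) are illustrative assumptions, not details taken from the paper.

import numpy as np
from math import comb

def bernoulli_polynomials(x, n_terms):
    # Design matrix Phi with Phi[i, n] = B_n(x_i), built from the recurrence
    # B_n(x) = x^n - (1/(n+1)) * sum_{k<n} C(n+1, k) * B_k(x).
    x = np.asarray(x, dtype=float).ravel()
    Phi = np.empty((x.size, n_terms))
    Phi[:, 0] = 1.0                      # B_0(x) = 1
    for n in range(1, n_terms):
        Phi[:, n] = x**n - sum(comb(n + 1, k) * Phi[:, k] for k in range(n)) / (n + 1)
    return Phi

def wdd_weights(x, y, n_hidden):
    # Weights-direct-determination: one least-squares solve for the output
    # weights over the Bernoulli-polynomial design matrix (no iterative BP).
    Phi = bernoulli_polynomials(x, n_hidden)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def sad_structure(x, y, max_hidden=30):
    # A simple growing criterion standing in for structure-automatic-determination:
    # try 1..max_hidden hidden neurons and keep the size with the lowest training MSE.
    best = (None, None, np.inf)          # (n_hidden, weights, mse)
    for n in range(1, max_hidden + 1):
        w = wdd_weights(x, y, n)
        mse = np.mean((bernoulli_polynomials(x, n) @ w - y) ** 2)
        if mse < best[2]:
            best = (n, w, mse)
    return best

# Toy usage on a hypothetical target function f(x) = sin(pi*x) sampled on [0, 1].
x = np.linspace(0.0, 1.0, 200)
y = np.sin(np.pi * x)
n_hidden, weights, train_mse = sad_structure(x, y)

Because high-degree Bernoulli polynomials grow quickly outside [0, 1], such a direct solve works best on normalized inputs; the paper's own SAD criterion may differ from this brute-force search over hidden-layer sizes.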