Comparison of artificial neural network architecture in solving ordinary differential equations
Advances in Artificial Neural Systems
Gradient descent algorithms such as backpropagation (BP) and its variants on multilayer feed-forward networks are widely used in many applications, including the solution of differential equations. According to regularization theory, reformulated radial basis function networks (RBFN) are expected to generalize more accurately than BP-trained feed-forward networks. We show how to apply both kinds of network to a specific example differential equation and compare their generalization capability and convergence. The experimental comparison clarifies that the reformulated RBFN outperforms BP in solving this example.
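To make the setup being compared concrete, the following is a minimal Python sketch of the trial-solution approach to solving an ODE with a network trained by gradient descent. It is not the paper's experiment: the equation (dy/dx = -y, y(0) = 1, exact solution exp(-x)), the Gaussian RBF network, and all hyperparameters are assumptions chosen for brevity, and the gradient is taken by finite differences rather than the analytic update rules a real implementation would use.

import numpy as np

rng = np.random.default_rng(0)
xs = np.linspace(0.0, 1.0, 20)            # collocation points in [0, 1]
K = 6                                     # number of RBF hidden units (arbitrary)

# flat parameter vector: output weights w, centers c, widths s
params = np.concatenate([rng.normal(0.0, 0.1, K),   # w
                         np.linspace(0.0, 1.0, K),  # c
                         np.full(K, 0.3)])          # s

def rbf(x, p):
    # Gaussian RBF network: N(x) = sum_k w_k * exp(-(x - c_k)^2 / s_k^2)
    w, c, s = p[:K], p[K:2*K], p[2*K:]
    return np.exp(-((x[:, None] - c) ** 2) / s ** 2) @ w

def trial(x, p):
    # trial solution y_t(x) = 1 + x * N(x) satisfies y_t(0) = 1 by construction
    return 1.0 + x * rbf(x, p)

def loss(p):
    # mean squared residual of y' + y = 0, with y' by central differences
    h = 1e-4
    dy = (trial(xs + h, p) - trial(xs - h, p)) / (2.0 * h)
    return np.mean((dy + trial(xs, p)) ** 2)

# plain gradient descent; the gradient itself is also taken numerically
# here purely to keep the sketch short
lr, eps = 0.05, 1e-6
for step in range(2000):
    grad = np.zeros_like(params)
    for i in range(params.size):
        e = np.zeros_like(params)
        e[i] = eps
        grad[i] = (loss(params + e) - loss(params - e)) / (2.0 * eps)
    params -= lr * grad

print("max |y_t(x) - exp(-x)| =", np.max(np.abs(trial(xs, params) - np.exp(-xs))))

Replacing the RBF hidden layer with a sigmoidal one, trained on the same residual loss by BP, would give the feed-forward counterpart in such a comparison; the trial form 1 + x*N(x) is what lets unconstrained gradient descent respect the boundary condition in either case.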