Comparison of generalization ability on solving differential equations using backpropagation and reformulated radial basis function networks

  • Authors:
  • Bumghi Choi; Ju-Hong Lee

  • Affiliations:
  • School of Computer Science and Engineering, Inha University, Incheon, Republic of Korea (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2009

Abstract

Gradient descent algorithms such as backpropagation (BP) and its variants on multilayered feed-forward networks are widely used in many applications, in particular for solving differential equations. According to regularization theory, reformulated radial basis function networks (RBFN) are expected to generalize more accurately than BP-trained networks. We show how to apply both kinds of networks to a specific example of a differential equation and compare their generalization ability and convergence. The experimental comparison of the various approaches shows that the reformulated RBFN outperforms BP on this example.
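
To make the setting concrete, below is a minimal sketch (not the authors' implementation) of the neural trial-solution approach to a differential equation, using a Gaussian radial basis function network. The ODE dy/dx = -y with y(0) = 1, the fixed centers and width, and the least-squares fit of the output weights are all illustrative assumptions; the paper's reformulated RBFN and the BP-trained multilayer network differ in their training procedures, which this sketch does not reproduce.

```python
# Trial-solution approach for dy/dx = -y, y(0) = 1 on [0, 1] with a Gaussian
# RBF network. All parameter choices here are hypothetical illustrations.
import numpy as np

x = np.linspace(0.0, 1.0, 30)          # collocation (training) points
centers = np.linspace(0.0, 1.0, 10)    # fixed Gaussian centers
sigma = 0.15                           # shared basis-function width

def phi(x, c):
    """Gaussian basis function value."""
    return np.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def dphi(x, c):
    """Derivative of the Gaussian basis function with respect to x."""
    return -(x - c) / sigma ** 2 * phi(x, c)

# Trial solution psi(x) = 1 + x * N(x) satisfies y(0) = 1 by construction,
# where N(x) = sum_i w_i * phi_i(x) is the RBF network output.
# The ODE residual dpsi/dx + psi is linear in the weights w, so the
# collocation system can be solved directly by least squares here
# (a BP-style network would instead need iterative gradient descent).
Phi  = phi(x[:, None], centers[None, :])     # shape (n_points, n_centers)
dPhi = dphi(x[:, None], centers[None, :])

# residual_j = sum_i w_i * (phi_i + x*dphi_i + x*phi_i) + 1  ->  A w = -1
A = Phi + x[:, None] * dPhi + x[:, None] * Phi
b = -np.ones_like(x)
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# Generalization check on unseen test points against the exact solution e^{-x}.
x_test = np.linspace(0.0, 1.0, 200)
psi = 1.0 + x_test * (phi(x_test[:, None], centers[None, :]) @ w)
print("max |psi - exp(-x)| on test points:",
      np.max(np.abs(psi - np.exp(-x_test))))
```

The maximum error on the held-out test grid serves as a simple proxy for the generalization ability that the paper compares between the two network types.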