The application of different RBF neural network in approximation

  • Authors:
  • Jincai Chang;Long Zhao;Qianli Yang

  • Affiliations:
  • College of Science, Hebei United University, Tangshan, Hebei, China (all authors)

  • Venue:
  • ICICA'11: Proceedings of the Second International Conference on Information Computing and Applications
  • Year:
  • 2011

Abstract

The numerical algorithms of classical function approximation theory share common drawbacks: they are computationally intensive, adapt poorly, place high demands on models and data, and are limited in practical applications. Neural networks can model complex relationships between inputs and outputs and therefore have strong function approximation capability. This paper describes the application of RBF neural networks (RBFNNs) to function approximation and to the interpolation of scattered data. RBF neural networks most commonly use the Gaussian function as the transfer function. When such a network is trained on a data set, the spread constant SPREAD of the radial basis function must be chosen: if SPREAD is set too small, the network overfits the function being approximated, while if SPREAD is too large, the fit is too smooth and underfits it. This paper examines the use of different radial functions as transfer functions in the network design and analyzes their numerical applications. Simulations show that, for the same data set, the Gaussian radial basis function may not be the best choice.
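
The role of the SPREAD constant and of the choice of radial basis function can be illustrated with a small numerical sketch. The code below is not the authors' implementation; it is a minimal exact-interpolation RBF scheme on scattered 1-D data, assuming a Gaussian and a multiquadric basis and a hypothetical test function (np.sinc), and it compares how the spread value affects the approximation error.

```python
# A minimal sketch, not the authors' implementation: exact RBF interpolation of
# scattered 1-D data with two different radial basis (transfer) functions.
# The "spread" argument plays the role the abstract calls SPREAD.
import numpy as np

def gaussian(r, spread):
    # Gaussian basis: phi(r) = exp(-(r / spread)^2)
    return np.exp(-(r / spread) ** 2)

def multiquadric(r, spread):
    # Multiquadric basis: phi(r) = sqrt(r^2 + spread^2)
    return np.sqrt(r ** 2 + spread ** 2)

def rbf_interpolant(x_train, y_train, basis, spread):
    """Fit weights w so that sum_j w_j * phi(|x - x_j|) passes through the data."""
    r = np.abs(x_train[:, None] - x_train[None, :])
    phi = basis(r, spread)
    # Least squares copes with the ill-conditioning that a large spread causes.
    weights, *_ = np.linalg.lstsq(phi, y_train, rcond=None)

    def predict(x_new):
        r_new = np.abs(np.asarray(x_new)[:, None] - x_train[None, :])
        return basis(r_new, spread) @ weights

    return predict

if __name__ == "__main__":
    # Scattered samples of an illustrative test function (assumption, not from the paper).
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(-3.0, 3.0, 15))
    y = np.sinc(x)
    x_test = np.linspace(-3.0, 3.0, 201)

    for name, basis in [("gaussian", gaussian), ("multiquadric", multiquadric)]:
        for spread in (0.1, 1.0, 10.0):  # too small -> wiggly overfit; too large -> oversmooth
            f = rbf_interpolant(x, y, basis, spread)
            err = np.max(np.abs(f(x_test) - np.sinc(x_test)))
            print(f"{name:12s} spread={spread:5.1f}  max error on test grid = {err:.3e}")
```

Least squares is used instead of a direct solve only so that a very large spread, which makes the interpolation matrix nearly singular, does not abort the comparison; the comparison across basis functions and spread values is the kind of experiment the abstract describes.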