A New Radial Basis Function Networks Structure: Application to Time Series Prediction

  • Venue: IJCNN '00 Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00) - Volume 4
  • Year: 2000

Abstract

This article describes a new structure for creating an RBF neural network. The structure has four main characteristics. First, the architecture replaces the constant output weights normally used with regression weights, which are assumed to be functions of the input variables. Second, the activations of the hidden neurons are normalized before they are aggregated (a weighted average), which, as various authors have observed, produces better results than the classical weighted-sum architecture. Third, a new type of nonlinear function is proposed: the pseudo-Gaussian basis function (PGBF). This gives the neural system greater flexibility, since a neuron's activation field need not be symmetric with respect to the center or the location of the neuron in the input space. Fourth, and finally, we propose a sequential learning algorithm that adapts the structure of the network, creating new hidden units and detecting and removing inactive ones.
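To make the architecture concrete, the sketch below shows one possible forward pass for such a network: asymmetric (pseudo-Gaussian) activations, output weights that are affine functions of the inputs, and a normalized (weighted-average) aggregation. The specific functional forms (one width per side of the center in each dimension, linear regression weights) and all names are illustrative assumptions; the paper's exact definitions and its sequential learning algorithm are not reproduced here.

```python
import numpy as np

def pseudo_gaussian(x, c, sigma_left, sigma_right):
    """Asymmetric (pseudo-Gaussian) activation of one hidden unit.

    In each input dimension a different width is used depending on whether
    the input lies to the left or right of the center, so the activation
    field need not be symmetric about the center.
    x, c, sigma_left, sigma_right: arrays of shape (n_inputs,).
    """
    sigma = np.where(x < c, sigma_left, sigma_right)
    return np.exp(-np.sum(((x - c) / sigma) ** 2))

def normalized_rbf_output(x, centers, sig_l, sig_r, b0, B):
    """Forward pass of a normalized RBF network with regression weights.

    centers, sig_l, sig_r: (n_units, n_inputs) centers and side widths.
    b0: (n_units,) intercepts of the regression weights.
    B:  (n_units, n_inputs) slopes of the regression weights.
    The output weight of unit i is w_i(x) = b0[i] + B[i] @ x, and the
    network output is the activation-weighted average of the w_i(x).
    """
    phi = np.array([pseudo_gaussian(x, c, sl, sr)
                    for c, sl, sr in zip(centers, sig_l, sig_r)])
    w = b0 + B @ x                                  # regression weights w_i(x)
    return np.dot(phi, w) / (np.sum(phi) + 1e-12)   # normalized aggregation
```

For time series prediction, x would typically be a vector of lagged samples and the returned scalar the one-step-ahead prediction; the paper's learning algorithm would additionally adjust the number of hidden units online.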