Regression with radial basis function artificial neural networks using QLP decomposition to prune hidden nodes with different functional form

  • Authors:
  • Edwirde Luiz Silva;Paulo J. G. Lisboa;Andrés González Carmona

  • Affiliations:
  • Departamento de Matemática e Estatística, Universidade Estadual da Paraíba - UEPB, Paraíba, Brasil;School of Computing and Mathematical Sciences, Liverpool John Moores University, Liverpool, England;Universidad de Granada, Departamento de Estadística e Investigación Operativa, España

  • Venue:
  • NN'07 Proceedings of the 8th Conference on 8th WSEAS International Conference on Neural Networks - Volume 8
  • Year:
  • 2007

Abstract

Radial Basis Function networks with linear output are often used in regression problems because they can be substantially faster to train than Multi-layer Perceptrons. We show how radial basis functions of Cauchy, multiquadric, and inverse multiquadric type can be used to approximate a rapidly changing continuous test function. In this paper, the performance of the design matrix reduced by QLP decomposition is compared with model selection criteria such as the Schwarz Bayesian Information Criterion (BIC). We introduce the concept of linear basis function models and of the design matrix reduced by QLP decomposition, followed by an application of the QLP methodology to prune networks with different choices of radial basis function. The QLP method proves effective for reducing the network size by pruning hidden nodes, resulting in a parsimonious model with accurate prediction of a sinusoidal test function.
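
As a rough illustration of the pruning idea summarised above, the sketch below builds an RBF design matrix for the named basis-function families and applies a pivoted QLP decomposition (two pivoted QR sweeps) to retain only the hidden nodes associated with significant L-diagonal entries, before fitting the linear output weights by least squares. The test function, centre placement, width, and tolerance are illustrative assumptions, not values or procedures taken from the paper itself.

```python
import numpy as np
from scipy.linalg import qr

# Hypothetical 1-D regression setup: a rapidly changing sinusoidal target,
# approximated by an RBF network with a linear output layer.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(12.0 * np.pi * x) * np.exp(-2.0 * x)       # illustrative test function

centres = np.linspace(0.0, 1.0, 25)                    # initial hidden-node centres (assumed)
width = 0.05                                           # basis-function width (assumed)

def design_matrix(x, centres, width, kind="multiquadric"):
    """Design matrix Phi[i, j] = phi(|x_i - c_j|) for one choice of RBF."""
    r = np.abs(x[:, None] - centres[None, :])
    if kind == "cauchy":
        return 1.0 / (1.0 + (r / width) ** 2)
    if kind == "multiquadric":
        return np.sqrt(r ** 2 + width ** 2)
    if kind == "inverse_multiquadric":
        return 1.0 / np.sqrt(r ** 2 + width ** 2)
    raise ValueError(kind)

def qlp_prune(Phi, tol=1e-3):
    """Pivoted QLP: pivoted QR of Phi, then pivoted QR of R^T.
    The diagonal of the resulting L factor approximates the singular
    values of Phi; columns (hidden nodes) tied to negligible entries
    are pruned."""
    Q0, R0, piv0 = qr(Phi, mode="economic", pivoting=True)
    Q1, R1, piv1 = qr(R0.T, mode="economic", pivoting=True)
    l_diag = np.abs(np.diag(R1))                       # |diag(L)|, in decreasing order
    keep_rank = int(np.sum(l_diag > tol * l_diag[0]))  # estimated numerical rank
    return np.sort(piv0[:keep_rank])                   # indices of retained hidden nodes

Phi = design_matrix(x, centres, width, kind="multiquadric")
kept = qlp_prune(Phi)
w, *_ = np.linalg.lstsq(Phi[:, kept], y, rcond=None)   # output weights of the pruned network
print(f"hidden nodes kept: {len(kept)} of {len(centres)}")
```

Swapping `kind` between "cauchy", "multiquadric", and "inverse_multiquadric" reuses the same pruning step for each functional form; the tolerance `tol` plays the role of the model-size control that the paper compares against BIC-based selection.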