A Versatile Hyper-Ellipsoidal Basis Function for Function Approximation in High Dimensional Space

  • Authors:
  • Saichon Jaiyen, Chidchanok Lursinsap, Suphakant Phimoltares

  • Affiliations:
  • Advance Virtual and Intelligent Computing (AVIC) Center, Department of Mathematics, Chulalongkorn University, Bangkok, Thailand 10330 (all authors)

  • Venue:
  • ISNN '09: Proceedings of the 6th International Symposium on Neural Networks, Advances in Neural Networks
  • Year:
  • 2009


Abstract

This paper presents a versatile hyper-ellipsoidal basis function for function approximation in a given high-dimensional space. The hyper-ellipsoidal basis function can be translated and rotated to cover the data according to their distribution in that space. Based on this function, we propose a one-pass hyper-ellipsoidal learning algorithm in which any new incoming datum can be learned without revisiting the previously learned data. This learning algorithm adjusts the parameters of the versatile hyper-ellipsoidal basis function. In addition, we propose the hyper-ellipsoidal basis function (HEBF) neural network, which uses the one-pass hyper-ellipsoidal learning algorithm. The structure of this network is similar to that of radial basis function (RBF) neural networks. The number of hidden neurons in the HEBF network can increase or decrease during the learning process: it grows according to a geometric growth criterion and shrinks by merging two hidden neurons into a new one according to a merging criterion. The merging process can be performed independently, without reconsidering the learned data set.
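
As a rough, illustrative sketch only (the abstract does not give the paper's exact formulation), a hyper-ellipsoidal basis function can be modeled as a Gaussian with a full covariance matrix: the center translates the receptive field, while the eigenvectors and eigenvalues of the covariance rotate and stretch it. The standard isotropic RBF is recovered as the special case Sigma = sigma^2 * I. The function names below are hypothetical.

```python
import numpy as np

def rbf(x, c, sigma):
    """Standard radial basis function: spherical receptive field around c."""
    d2 = np.sum((x - c) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hyper_ellipsoidal_bf(x, c, Sigma):
    """Illustrative hyper-ellipsoidal basis function: a Gaussian with
    full covariance Sigma. The eigenvectors of Sigma rotate the
    ellipsoid, its eigenvalues scale the axes, and c translates it."""
    d = x - c
    m2 = d @ np.linalg.solve(Sigma, d)  # squared Mahalanobis distance
    return np.exp(-0.5 * m2)

# Example: an ellipsoid elongated along the direction (1, 1).
c = np.array([0.0, 0.0])
R = np.array([[1.0, -1.0], [1.0, 1.0]]) / np.sqrt(2.0)  # 45-degree rotation
Sigma = R @ np.diag([4.0, 0.25]) @ R.T                  # rotated covariance
print(hyper_ellipsoidal_bf(np.array([1.0, 1.0]), c, Sigma))   # high response along (1, 1)
print(hyper_ellipsoidal_bf(np.array([1.0, -1.0]), c, Sigma))  # low response across it
```

In an RBF-style network, each hidden neuron would apply one such function to the input. A full-covariance unit of this kind can cover an elongated, rotated cluster that a spherical unit would need several neurons to approximate, which is one plausible motivation for the growing and merging of hidden neurons described in the abstract.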