Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training

  • Authors:
  • R. N. Mahdi; E. C. Rouchka

  • Affiliations:
  • Dept. of Genetic Med., Weill Cornell Med. Coll., New York, NY, USA (second affiliation not listed)

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2011

Abstract

Hyper basis function (HyperBF) networks are generalized radial basis function (RBF) neural networks in which the activation function is a radial function of a weighted distance. This generalization gives HyperBF networks a high capacity to learn complex functions, which in turn makes them susceptible to overfitting and poor generalization. Moreover, training a HyperBF network requires the weights, centers, and local scaling factors to be optimized simultaneously; for a relatively large dataset with a large network structure, this optimization becomes computationally challenging. In this paper, a new regularization method that performs soft local dimension reduction in addition to weight decay is proposed. The regularized HyperBF network is shown to provide classification accuracy competitive with that of a support vector machine while requiring a significantly smaller network structure. Furthermore, a practical training procedure for constructing HyperBF networks is presented. Hierarchical clustering is used to initialize the neurons, followed by gradient-based optimization using a scaled version of the Rprop algorithm with a localized partial backtracking step. Experimental results on seven datasets show that the proposed training provides faster and smoother convergence than the regular Rprop algorithm.
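
For context, a HyperBF unit (in the sense of Poggio and Girosi's generalized RBF) replaces the ordinary Euclidean distance with a weighted distance. A common formulation, which may differ in notation from the paper's, is

$$f(\mathbf{x}) = \sum_{j=1}^{N} w_j \, e^{-\|\mathbf{x} - \mathbf{c}_j\|_{\mathbf{W}_j}^2}, \qquad \|\mathbf{x} - \mathbf{c}_j\|_{\mathbf{W}_j}^2 = (\mathbf{x} - \mathbf{c}_j)^\top \mathbf{W}_j^\top \mathbf{W}_j \, (\mathbf{x} - \mathbf{c}_j),$$

where $\mathbf{c}_j$ is the center and $\mathbf{W}_j$ the weighting matrix of neuron $j$. When $\mathbf{W}_j$ is diagonal with per-dimension scaling factors, shrinking a factor toward zero effectively removes that dimension from neuron $j$'s distance computation, which is the intuition behind soft local dimension reduction.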
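The abstract does not spell out the penalty; purely as an illustration (not the paper's exact regularizer), one plausible form combines weight decay with a sparsity term on the per-dimension scales:

$$R(\theta) = \lambda_1 \sum_{j} w_j^2 \;+\; \lambda_2 \sum_{j} \sum_{d} |s_{jd}|,$$

where $s_{jd}$ is the $d$-th diagonal scaling factor of $\mathbf{W}_j$. The $\ell_1$ term softly drives individual scales to zero, locally reducing the effective dimensionality, while the first term is ordinary weight decay on the output weights.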
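To make the training step concrete, below is a minimal sketch of a Rprop-style update with a localized partial backtracking step: parameters whose gradient sign flips shrink their step size and partially undo their previous move, while all others proceed as in standard Rprop. The function name, the backtrack factor, and the exact scaling rule are assumptions for illustration; the paper's scaled Rprop may differ in detail.

```python
import numpy as np

def rprop_step(params, grads, prev_grads, steps, prev_updates,
               eta_minus=0.5, eta_plus=1.2,
               step_min=1e-6, step_max=1.0, backtrack=0.5):
    """One Rprop-style update with localized partial backtracking.

    A sketch under assumed conventions; not the paper's exact algorithm.
    All arguments are NumPy arrays of the same shape except the scalars.
    """
    sign_change = grads * prev_grads
    # Gradient sign unchanged: grow the per-parameter step size.
    steps = np.where(sign_change > 0,
                     np.minimum(steps * eta_plus, step_max), steps)
    # Gradient sign flipped: shrink the step size for those parameters.
    steps = np.where(sign_change < 0,
                     np.maximum(steps * eta_minus, step_min), steps)
    # Default move: step against the gradient sign (standard Rprop).
    update = -np.sign(grads) * steps
    # Localized partial backtracking: only parameters whose gradient
    # flipped sign partially undo their previous move.
    update = np.where(sign_change < 0, -backtrack * prev_updates, update)
    # Standard Rprop convention: after a sign flip, treat the stored
    # gradient as zero so the next step skips the adaptation branch.
    new_prev_grads = np.where(sign_change < 0, 0.0, grads)
    return params + update, new_prev_grads, steps, update
```

Here steps would typically be initialized to a small constant (e.g., 0.01) and prev_grads and prev_updates to zeros; eta_plus = 1.2 and eta_minus = 0.5 are the conventional Rprop values.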