Function approximation using generalized adalines

  • Authors:
  • Jiann-Ming Wu, Zheng-Han Lin, P.-H. Hsu

  • Affiliations:
  • Dept. of Appl. Math., Nat. Dong Hwa Univ., Hualien

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2006

Abstract

This paper proposes a neural organization of generalized adalines (gadalines) for data-driven function approximation. By generalizing the threshold function of adalines, we obtain the K-state transfer function of gadalines, which responds with a unitary vector of K binary values to the projection of a predictor on a receptive field. A generative component that uses the K-state activation of a gadaline to trigger K posterior independent normal variables is employed to emulate stochastic predictor-oriented target generation. Fitting a generative component to a set of paired data translates mathematically to a mixed integer and linear programming problem. Since the problem involves both continuous and discrete variables, it is solved by a hybrid of mean field annealing and gradient descent methods. Following a leave-one-out learning strategy, the resulting learning method is extended to optimize multiple generative components. The learning result yields the parameters of a deterministic gadaline network for function approximation. Numerical simulations test the proposed learning method with paired data generated from a variety of target functions. The results show that the proposed method outperforms MLP and RBF learning methods for data-driven function approximation.
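
The abstract's description of the K-state transfer function suggests a simple realization: the projection of a predictor on a receptive field is binned into one of K states, producing a one-hot activation, and the network's output selects one of K learned values per component. The sketch below, in Python with NumPy, illustrates this reading. The threshold-based partition of the projection axis, the additive combination of components, and the names gadaline_activation and gadaline_network are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def gadaline_activation(x, w, thresholds):
    """K-state transfer function of a gadaline (illustrative sketch).

    Projects predictor x onto receptive field w, then emits a unitary
    (one-hot) vector of K binary values indicating which of the K
    intervals, delimited by K-1 sorted thresholds, the projection
    falls into. The interval parameterization is an assumption.
    """
    r = np.dot(w, x)                              # projection on the receptive field
    k = np.searchsorted(np.sort(thresholds), r)   # active state index in 0..K-1
    a = np.zeros(len(thresholds) + 1)
    a[k] = 1.0                                    # exactly one of K states is active
    return a

def gadaline_network(x, components):
    """Deterministic gadaline network (assumed form): each component
    selects one of its K values (e.g., learned posterior means) via its
    K-state activation, and the component outputs are summed."""
    return sum(gadaline_activation(x, w, t) @ m for (w, t, m) in components)

if __name__ == "__main__":
    # Toy 1-D demo: one gadaline with K = 4 states yields a piecewise-constant map.
    rng = np.random.default_rng(0)
    component = (np.array([1.0]),                      # receptive field w
                 np.array([-0.5, 0.0, 0.5]),           # K-1 thresholds
                 np.array([-1.0, -0.2, 0.2, 1.0]))     # K output values
    for x in rng.uniform(-1.0, 1.0, size=5):
        y = gadaline_network(np.array([x]), [component])
        print(f"x = {x:+.2f} -> f(x) = {y:+.2f}")
```

With a single component the sketch produces a piecewise-constant approximant; summing many components with different receptive fields and thresholds is what would give the richer function class the paper evaluates. The actual parameters are obtained by the hybrid mean field annealing and gradient descent learning described in the abstract, which this sketch does not attempt to reproduce.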