Kolmogorov's spline network

  • Authors:
  • B. Igelnik; N. Parikh

  • Affiliations:
  • Pegasus Technol. Inc., Mentor, OH, USA; -

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2003


Abstract

In this paper, an innovative neural-network architecture is proposed and elucidated. This architecture, based on Kolmogorov's superposition theorem (1957) and called the Kolmogorov's spline network (KSN), utilizes more degrees of adaptation to data than currently used neural-network architectures (NNAs). By using the cubic-spline technique of approximation for both the activation and the internal functions, a more efficient approximation of multivariate functions can be achieved. The bounds on the approximation error and on the number of adjustable parameters, derived in this paper, compare the KSN favorably with other one-hidden-layer feedforward NNAs. The training of the KSN, using the ensemble approach and the ensemble multinet, is described. A new explicit algorithm for constructing cubic splines is presented.
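
For context, Kolmogorov's superposition theorem states that any continuous multivariate function on the unit cube can be written exactly as a finite superposition of continuous univariate functions. The LaTeX sketch below restates that representation and an illustrative spline-based approximation of the kind the abstract describes, in which both the outer (activation) functions and the inner (internal) functions are replaced by adjustable cubic splines; the notation and parameterization shown are assumptions for illustration, not the paper's exact formulation.

% Kolmogorov's superposition theorem (1957): every continuous
% f : [0,1]^d -> R admits the exact representation
\[
  f(x_1,\dots,x_d) \;=\; \sum_{q=0}^{2d} \Phi_q\!\left( \sum_{p=1}^{d} \psi_{qp}(x_p) \right),
\]
% with continuous univariate outer functions \Phi_q and inner functions \psi_{qp}.
% Illustrative KSN-style approximation (assumed notation): both \Phi_q and
% \psi_{qp} are modeled by cubic splines s(\,\cdot\,;\theta) with adjustable
% parameters, so that both layers of univariate functions are trained on data:
\[
  \hat f(x_1,\dots,x_d;\Theta) \;=\; \sum_{q=0}^{2d} s\!\left( \sum_{p=1}^{d} s(x_p;\theta_{qp});\,\theta_q \right).
\]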