Approximation properties of local bases assembled from neural network transfer functions

  • Authors:
  • A. J. Meade, Jr.; B. A. Zeldin

  • Affiliations:
  • Department of Mechanical Engineering and Materials Science, Mail Stop 321, William Marsh Rice University, Houston, TX 77251-1892, U.S.A. (both authors)

  • Venue:
  • Mathematical and Computer Modelling: An International Journal
  • Year:
  • 1998


Abstract

The adaptive, data-driven emulation and control of mechanical systems are popular applications of artificial neural networks in engineering. However, training a multilayer perceptron is an ill-posed nonlinear optimization problem. This paper explores a method of constraining the network parameters so that conventional computational techniques for function approximation can be used during training. This is accomplished by forming local basis functions that provide accurate approximation and stable evaluation of the network parameters. The approach is general and does not violate the principles of feedforward network architecture. By employing the concept of shift-invariant subspaces, it yields a new and more robust error condition for feedforward artificial neural networks and allows one to both characterize and control the accuracy of the local bases formed. Two refinement methods are used: (1) adding bases while altering their shape and keeping their spacing constant, and (2) adding bases while altering their shape and decreasing their spacing in a coupled fashion. Numerical examples demonstrate the usefulness of the proposed approximation of functions and their derivatives.