The essential approximation order for neural networks with trigonometric hidden layer units

  • Authors:
  • Chunmei Ding, Feilong Cao, Zongben Xu

  • Affiliations:
  • Department of Information and Mathematics Sciences, College of Science, China Jiliang University, Hangzhou, Zhejiang, P.R. China; Institute for Information and System Sciences, Faculty of Science, Xi’an Jiaotong University, Xi’an, Shaanxi, P.R. China

  • Venue:
  • ISNN'06: Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2006


Abstract

There have been various studies of the approximation ability of feedforward neural networks. The existing studies, however, are concerned only with density results or upper-bound estimates of how well a multivariate function can be approximated by such networks, and consequently they cannot reveal the essential approximation ability of the networks. In this paper, by establishing both upper and lower bound estimates on the approximation order, the essential approximation ability of a class of feedforward neural networks with trigonometric hidden layer units is characterized in terms of the second-order modulus of smoothness of the approximated function.
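
For orientation, the second-order modulus of smoothness appearing in the result is the standard one from approximation theory. The sketch below recalls its definition and the generic shape of the kind of estimate described in the abstract; the symbols N_n, C and the rate 1/n are illustrative placeholders, not the paper's exact statement or constants.

```latex
% Second-order modulus of smoothness of f (standard definition; the norm is
% the one used in the approximation setting, e.g. the uniform norm):
\omega_2(f, t) \;=\; \sup_{0 < \|h\| \le t}
  \bigl\| f(\,\cdot + h\,) - 2\,f(\cdot) + f(\,\cdot - h\,) \bigr\|.

% Generic shape of an upper (Jackson-type) estimate for such networks, where
% N_n ranges over networks with n trigonometric hidden units and C > 0 is a
% constant independent of f and n (placeholder notation, not the paper's):
\inf_{N_n} \, \| f - N_n \|
  \;\le\; C \, \omega_2\!\left( f, \tfrac{1}{n} \right).

% The paper complements this with a matching lower-bound estimate, showing
% that the order \omega_2(f, 1/n) cannot be improved in general; the precise
% sense of that lower bound is the one established in the paper.
```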