We establish versions of Descartes' rule of signs for radial basis function (RBF) neural networks. These RBF rules of signs provide tight bounds on the number of zeros of univariate networks with certain parameter restrictions. Moreover, they can be used to derive tight bounds on the Vapnik-Chervonenkis (VC) dimension and pseudo-dimension of these networks. In particular, we show that these dimensions are no more than linear. This result contrasts with previous work showing that RBF neural networks with two or more input nodes have superlinear VC dimension. The rules also give rise to lower bounds on network sizes, thus demonstrating the relevance of network parameters to the complexity of computing with RBF neural networks.
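For intuition, the classical Descartes' rule of signs bounds the number of positive real zeros of a polynomial by the number of sign changes in its coefficient sequence; the results above establish analogous bounds for univariate RBF networks. The following is a minimal illustrative sketch, not code from the paper: all function names and parameter choices are ours, and the RBF zero count is a crude numeric estimate on a grid, not an application of the paper's theorems.

```python
import math

def sign_changes(seq):
    """Count sign changes in a sequence, ignoring zero entries."""
    signs = [x for x in seq if x != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

# Classical Descartes' rule: p(x) = x^3 - 3x + 2 has coefficient
# sequence [1, 0, -3, 2] with 2 sign changes, so at most 2 positive
# real zeros (here exactly 2, counting the double root at x = 1).

def rbf(x, weights, centers, width):
    """Univariate Gaussian RBF network with equal widths (illustrative)."""
    return sum(w * math.exp(-((x - c) / width) ** 2)
               for w, c in zip(weights, centers))

def grid_zero_count(f, lo, hi, steps=10000):
    """Crude numeric zero count of f via sign changes on a grid."""
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return sign_changes([f(x) for x in xs])

# A small network whose weight sequence [1, -2, 1] has 2 sign changes;
# numerically, its output also crosses zero twice on [-10, 10].
w, c = [1.0, -2.0, 1.0], [-2.0, 0.0, 2.0]
zeros = grid_zero_count(lambda x: rbf(x, w, c, 1.0), -10, 10)
```

In this example the number of zeros of the network happens to match the number of sign changes in its weight sequence; the paper makes precise under which parameter restrictions such sign-change bounds hold.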