Local receptive field neurons comprise such well-known and widely used unit types as radial basis function (RBF) neurons and neurons with center-surround receptive fields. We study the Vapnik-Chervonenkis (VC) dimension of feedforward neural networks with one hidden layer of these units. For several variants of local receptive field neurons, we show that the VC dimension of these networks is superlinear. In particular, we establish the bound Ω(W log k) for any reasonably sized network with W parameters and k hidden nodes. This bound is shown to hold for discrete center-surround receptive field neurons, which are physiologically relevant models of cells in the mammalian visual system; for neurons computing a difference of Gaussians, which are popular in computational vision; and for standard RBF neurons, a major alternative to sigmoidal neurons in artificial neural networks. The result for RBF neural networks is of particular interest, since it answers a question that had been open for several years. The results also yield lower bounds for networks with fixed input dimension. In terms of constants, all bounds are larger than those previously known for similar architectures with sigmoidal neurons. These superlinear lower bounds contrast with the linear upper bounds, also derived here, for single local receptive field neurons.
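For readers unfamiliar with the unit types named above, the following minimal LaTeX sketch records their standard textbook forms alongside the stated bound. The parameterization (center c, widths σ₁ < σ₂, amplitudes a and b) reflects common usage and is an assumption for illustration, not the paper's exact definitions.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Gaussian RBF neuron with center $c$ and width $\sigma$:
\[
  g_{\text{RBF}}(x) = \exp\!\left(-\frac{\|x - c\|^2}{\sigma^2}\right)
\]

% Difference-of-Gaussians (DOG) neuron: an excitatory center Gaussian
% minus a broader inhibitory surround Gaussian ($\sigma_2 > \sigma_1$),
% a common model of a center-surround receptive field:
\[
  g_{\text{DOG}}(x) = a\,\exp\!\left(-\frac{\|x - c\|^2}{\sigma_1^2}\right)
                    - b\,\exp\!\left(-\frac{\|x - c\|^2}{\sigma_2^2}\right)
\]

% The superlinear lower bound stated in the abstract, for a one-hidden-layer
% network with $W$ parameters and $k$ hidden units of these types:
\[
  \operatorname{VCdim} = \Omega(W \log k)
\]

\end{document}
```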