Neural networks with local receptive fields and superlinear VC dimension

  • Authors:
  • Michael Schmitt

  • Affiliations:
Lehrstuhl Mathematik und Informatik, Fakultät für Mathematik, Ruhr-Universität Bochum, D-44780 Bochum, Germany

  • Venue:
  • Neural Computation
  • Year:
  • 2002

Abstract

Local receptive field neurons comprise such well-known and widely used unit types as radial basis function (RBF) neurons and neurons with center-surround receptive fields. We study the Vapnik-Chervonenkis (VC) dimension of feedforward neural networks with one hidden layer of these units. For several variants of local receptive field neurons, we show that the VC dimension of these networks is superlinear. In particular, we establish the bound Ω(W log k) for any reasonably sized network with W parameters and k hidden nodes. This bound is shown to hold for discrete center-surround receptive field neurons, which are physiologically relevant models of cells in the mammalian visual system; for neurons computing a difference of Gaussians, which are popular in computational vision; and for standard RBF neurons, a major alternative to sigmoidal neurons in artificial neural networks. The result for RBF neural networks is of particular interest, since it answers a question that has been open for several years. The results also give rise to lower bounds for networks with fixed input dimension. In terms of constants, all bounds are larger than those previously known for similar architectures with sigmoidal neurons. The superlinear lower bounds contrast with linear upper bounds for single local receptive field neurons, which are also derived here.
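To make the quantities in the bound Ω(W log k) concrete, the following is a minimal sketch of a one-hidden-layer RBF network of the kind studied here, using a standard Gaussian unit with a center and a width per hidden node and a linear output node. The architecture and parameter count W = kd + 2k + 1 (centers, widths, output weights, output bias) are illustrative assumptions about the network form, not the paper's proof construction.

```python
import numpy as np

def rbf_network(x, centers, widths, out_weights, bias):
    """One-hidden-layer RBF network: each hidden node computes a Gaussian
    of the distance between the input x and its center."""
    # Hidden activations: exp(-||x - c_i||^2 / (2 * sigma_i^2))
    dists_sq = np.sum((centers - x) ** 2, axis=1)
    hidden = np.exp(-dists_sq / (2.0 * widths ** 2))
    # Linear output node
    return float(out_weights @ hidden + bias)

# Illustrative sizes: d-dimensional inputs, k hidden RBF nodes.
d, k = 4, 3
rng = np.random.default_rng(0)
centers = rng.normal(size=(k, d))   # k * d center coordinates
widths = np.ones(k)                 # k width parameters
out_weights = rng.normal(size=k)    # k output weights
bias = 0.0                          # 1 output bias

# Total adjustable parameters W for this architecture
W = k * d + 2 * k + 1
print("W =", W, ", hidden nodes k =", k)
print("output:", rbf_network(np.zeros(d), centers, widths, out_weights, bias))
```

The VC dimension lower bound discussed in the abstract concerns how many input points such a network with W parameters and k hidden nodes can shatter; the sketch only fixes what W and k refer to for this architecture.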