Neural nets with superlinear VC-dimension

  • Authors:
  • Wolfgang Maass


  • Venue:
  • Neural Computation
  • Year:
  • 1994


Abstract

It has been known for quite a while that the Vapnik-Chervonenkis dimension (VC-dimension) of a feedforward neural net with linear threshold gates is at most O(w · log w), where w is the total number of weights in the neural net. We show in this paper that this bound is in fact asymptotically optimal. More precisely, we exhibit for any depth d ≥ 3 a large class of feedforward neural nets of depth d with w weights that have VC-dimension Ω(w · log w). This lower bound holds even if the inputs are restricted to Boolean values. The proof of this result relies on a new method that allows us to encode more program bits in the weights of a neural net than previously thought possible.
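The superlinear growth of the O(w · log w) bound can be illustrated with a minimal sketch; the function below is a hypothetical helper (not from the paper) that evaluates w · log₂ w, ignoring the hidden constant factor in the asymptotic bound.

```python
import math

def vc_upper_bound(w: int) -> float:
    # w * log2(w): the shape of the O(w * log w) VC-dimension bound,
    # up to an unspecified constant factor (assumption for illustration).
    return w * math.log2(w)

# Superlinearity: doubling the number of weights w more than
# doubles the value of the bound.
for w in (64, 128, 256):
    print(w, vc_upper_bound(w))
```

For example, going from w = 64 to w = 128 multiplies the bound by more than two, which is exactly the superlinear behavior that the paper's Ω(w · log w) lower bound shows is unavoidable for depth d ≥ 3.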