The VC Dimension for Mixtures of Binary Classifiers

  • Authors: Wenxin Jiang
  • Affiliations: Department of Statistics, Northwestern University, Evanston, IL 60208, U.S.A.
  • Venue: Neural Computation
  • Year: 2000

Abstract

The mixtures-of-experts (ME) methodology provides a tool for classification in which experts, consisting of logistic regression models or Bernoulli models, are mixed according to a set of local weights. We show that the Vapnik-Chervonenkis (VC) dimension of the ME architecture is bounded below by the number of experts m and above by O(m^4 s^2), where s is the dimension of the input. For mixtures of Bernoulli experts with a scalar input, we show that the lower bound m is attained, in which case we obtain the exact result that the VC dimension is equal to the number of experts.
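The following is a minimal sketch of an ME binary classifier with softmax gating and logistic-regression experts, assuming the standard ME formulation; the function names, parameter shapes, and this NumPy implementation are illustrative and not the paper's notation. It shows the two quantities the bounds are stated in: m, the number of experts, and s, the input dimension.

```python
import numpy as np

def softmax(z):
    """Softmax gating: converts gate scores into local mixing weights."""
    z = z - z.max()              # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def me_predict_proba(x, gate_w, gate_b, exp_w, exp_b):
    """P(y = 1 | x) under a mixtures-of-experts classifier (illustrative).

    x      : (s,) input vector, s = input dimension
    gate_w : (m, s) gating weights,  gate_b : (m,)  -- produce local weights
    exp_w  : (m, s) expert weights,  exp_b : (m,)   -- m logistic-regression experts
    """
    g = softmax(gate_w @ x + gate_b)   # local weights, nonnegative and summing to 1
    p = sigmoid(exp_w @ x + exp_b)     # each expert's estimate of P(y = 1 | x)
    return float(g @ p)                # mixture of the expert probabilities

# Example with m = 3 experts and input dimension s = 2.
rng = np.random.default_rng(0)
m, s = 3, 2
proba = me_predict_proba(rng.normal(size=s),
                         rng.normal(size=(m, s)), rng.normal(size=m),
                         rng.normal(size=(m, s)), rng.normal(size=m))
print(f"P(y=1|x) = {proba:.3f}")       # classify as 1 if the probability exceeds 0.5
```

A Bernoulli expert corresponds to the special case where each expert's probability is a constant (exp_w = 0), which is the setting where the paper shows the VC dimension equals m exactly for scalar input.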