An upper bound on the minimum number of monomials required to separate dichotomies of {-1, 1}^n

  • Authors:
  • Erhan Oztop

  • Affiliations:
  • JST-ICORP Computational Brain Project and ATR Computational Neuroscience Laboratory, Hikari-dai Soraku-gun, Kyoto, Japan

  • Venue:
  • Neural Computation
  • Year:
  • 2006

Abstract

It is known that any dichotomy of {-1, 1}^n can be learned (separated) by a higher-order neuron (a polynomial function) with 2^n inputs (monomials). In general, fewer than 2^n monomials suffice to solve a given dichotomy. Despite efforts to develop algorithms that find solutions with fewer monomials, relatively few studies have investigated the maximum density Π(n): the minimum number of monomials that suffices to separate an arbitrary dichotomy of {-1, 1}^n. This article derives a theoretical upper bound for this quantity that supersedes previously known bounds. The main theorem states that for any binary classification problem in {-1, 1}^n (n ≥ 1), one can always find a polynomial function solution with 2^n - 2^n/4 or fewer monomials; in particular, any dichotomy of {-1, 1}^n can be learned by a higher-order neuron with a fan-in of 2^n - 2^n/4 or less. With this result, a deterministic ratio bound independent of n is established for the first time: Π(n)/2^n ≤ 0.75. The proof of the main theorem is constructive, so it yields a deterministic algorithm that achieves the bound. The study provides the basic mathematical tools and forms the basis for further analyses that may have implications for the neural computation mechanisms employed in the cerebral cortex.
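The 2^n-monomial baseline mentioned above follows from the fact that every function f: {-1, 1}^n → {-1, 1} equals its own multilinear (Fourier) expansion over the 2^n parity monomials. A minimal sketch of that fact, together with the paper's bound 2^n - 2^n/4 = (3/4)·2^n, is given below; the function names (`fourier_expansion`, `monomial_bound`) are illustrative, not from the paper, and this is not the paper's constructive algorithm for finding sparser solutions:

```python
from itertools import product
from math import prod

def fourier_expansion(f, n):
    """Exact multilinear expansion of f: {-1,1}^n -> {-1,1}.

    Returns a dict mapping each index subset S (tuple of coordinates) to
    its coefficient f_hat(S) = E_x[f(x) * prod_{i in S} x_i]; there are
    2^n such monomials, so 2^n monomials always suffice.
    """
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for mask in product([0, 1], repeat=n):
        idx = tuple(i for i in range(n) if mask[i])
        coeffs[idx] = sum(f(x) * prod(x[i] for i in idx)
                          for x in points) / len(points)
    return coeffs

def evaluate(coeffs, x):
    """Evaluate the polynomial sum_S f_hat(S) * prod_{i in S} x_i at x."""
    return sum(c * prod(x[i] for i in idx) for idx, c in coeffs.items())

def monomial_bound(n):
    """The theorem's upper bound, 2^n - 2^n/4 (an integer for n >= 2)."""
    return 2 ** n - 2 ** n // 4

# Example: the 3-bit parity dichotomy, whose expansion is the single
# full-degree monomial x_1 * x_2 * x_3.
n = 3
parity = lambda x: prod(x)
coeffs = fourier_expansion(parity, n)
assert all(evaluate(coeffs, x) == parity(x)
           for x in product([-1, 1], repeat=n))
```

For n = 4, `monomial_bound(4)` gives 12 of the 16 possible monomials, matching the ratio bound Π(n)/2^n ≤ 0.75.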