A Bound on the Precision Required to Estimate a Boolean Perceptron from Its Average Satisfying Assignment

  • Authors:
  • Paul W. Goldberg

  • Affiliations:
  • -

  • Venue:
  • SIAM Journal on Discrete Mathematics
  • Year:
  • 2006

Abstract

A Boolean perceptron is a linear threshold function over the discrete Boolean domain $\{0,1\}^n$. That is, it maps any binary vector to 0 or 1, depending on whether the vector's components satisfy some linear inequality. In 1961, Chow showed that any Boolean perceptron is determined by the average or "center of gravity" of its "true" vectors (those that are mapped to 1), together with the total number of true vectors. Moreover, these quantities distinguish the function from any other Boolean function, not just from other Boolean perceptrons.

In this paper we go further, by identifying a lower bound on the Euclidean distance between the average satisfying assignment of a Boolean perceptron and the average satisfying assignment of a Boolean function that disagrees with that Boolean perceptron on a fraction $\epsilon$ of the input vectors. The distance between the two means is shown to be at least $(\epsilon/n)^{O(\log(n/\epsilon)\log(1/\epsilon))}$. This is motivated by the statistical question of whether an empirical estimate of this average allows us to recover a good approximation to the perceptron. Our result provides a mildly superpolynomial upper bound on the growth rate of the sample size required to learn Boolean perceptrons in the "restricted focus of attention" setting. In the process we also find some interesting geometrical properties of the vertices of the unit hypercube.
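
To make Chow's characterization concrete, the following Python sketch (not from the paper) brute-forces the two Chow quantities, the number of true vectors and their average, for a small threshold function, and then measures how far the average satisfying assignment moves when the function is flipped on a single input vector. The specific weights, threshold, and flipped point are illustrative assumptions, not taken from the paper.

```python
from itertools import product

import numpy as np

def chow_statistics(f, n):
    """Brute-force the quantities from Chow's theorem for a Boolean
    function f on {0,1}^n: the count of true vectors and their average
    (the "center of gravity"). Takes time 2^n, so small n only."""
    true_vectors = [np.array(x) for x in product((0, 1), repeat=n) if f(x) == 1]
    count = len(true_vectors)
    mean = np.mean(true_vectors, axis=0) if count else np.zeros(n)
    return count, mean

# Hypothetical perceptron on {0,1}^3: true iff x1 + 2*x2 + 3*x3 >= 3.
def perceptron(x):
    return int(x[0] + 2 * x[1] + 3 * x[2] >= 3)

# A Boolean function that disagrees with the perceptron on exactly one
# of the 8 input vectors, i.e. on a fraction epsilon = 1/8 of the domain.
def perturbed(x):
    return 1 - perceptron(x) if x == (1, 0, 1) else perceptron(x)

count_p, mean_p = chow_statistics(perceptron, 3)
count_q, mean_q = chow_statistics(perturbed, 3)

# Euclidean distance between the two average satisfying assignments; the
# paper lower-bounds it by (epsilon/n)^{O(log(n/epsilon) log(1/epsilon))}.
print("Chow parameters of the perceptron:", count_p, mean_p)
print("Distance between the means:", np.linalg.norm(mean_p - mean_q))
```

The enumeration is exponential in $n$; the point of the sketch is only that the two means are distinct whenever the functions disagree, which is the quantity the paper's lower bound controls.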