Extracting Computational Entropy and Learning Noisy Linear Functions

  • Authors:
  • Chia-Jung Lee; Chi-Jen Lu; Shi-Chun Tsai

  • Affiliations:
  • Department of Computer Science, National Chiao-Tung University, Hsinchu, Taiwan; Institute of Information Science, Academia Sinica, Taipei, Taiwan; Department of Computer Science, National Chiao-Tung University, Hsinchu, Taiwan

  • Venue:
  • COCOON '09: Proceedings of the 15th Annual International Conference on Computing and Combinatorics
  • Year:
  • 2009

Abstract

We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution $(f(\mathcal{X}) \mid \mathcal{X})$, for some function $f$ and some distribution $\mathcal{X}$, and we say that such a source has computational min-entropy $k$ if any circuit of size $2^k$ can only predict $f(x)$ correctly with probability at most $2^{-k}$ given input $x$ sampled from $\mathcal{X}$. We first show that it is impossible to have a seedless extractor for a single source of this kind. We then show that extraction becomes possible if we are allowed a seed which is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed which is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multi-source extractors from the traditional statistical setting to a computational one. We reduce the task of constructing such extractors to a problem in learning theory: learning linear functions under arbitrary distribution with adversarial noise. For this problem, we provide a learning algorithm, which may be of independent interest.
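
To make the reduction target concrete, here is a minimal, hypothetical sketch of the learning problem: examples $(x, \langle s, x \rangle \bmod 2)$ of a linear function over GF(2) are drawn, a fraction of the labels is corrupted, and a learner tries to recover $s$. Everything below is an illustrative assumption, not the paper's construction: the helper names `sample_examples` and `brute_force_learn` are invented, the noise is random label flipping rather than truly adversarial, the inputs are uniform rather than arbitrarily distributed, and the exhaustive learner is exponential in $n$, unlike the efficient algorithm the paper provides.

```python
# Hypothetical toy sketch of learning a noisy linear function over GF(2).
# NOT the paper's algorithm: noise here is random (not adversarial), inputs
# are uniform (not arbitrary), and the learner is brute-force.

import itertools
import random

def sample_examples(secret, m, noise_rate):
    """Draw m examples (x, <secret, x> mod 2), then flip the labels of a
    noise_rate fraction of them."""
    n = len(secret)
    examples = []
    for _ in range(m):
        x = [random.randint(0, 1) for _ in range(n)]
        y = sum(s * xi for s, xi in zip(secret, x)) % 2
        examples.append((x, y))
    for i in random.sample(range(m), int(noise_rate * m)):
        x, y = examples[i]
        examples[i] = (x, 1 - y)  # corrupted label
    return examples

def brute_force_learn(examples, n):
    """Return the linear function agreeing with the most examples.
    Exponential in n; a toy baseline for illustration only."""
    best, best_agree = None, -1
    for cand in itertools.product([0, 1], repeat=n):
        agree = sum(1 for x, y in examples
                    if sum(c * xi for c, xi in zip(cand, x)) % 2 == y)
        if agree > best_agree:
            best, best_agree = cand, agree
    return list(best)

if __name__ == "__main__":
    n = 8
    secret = [random.randint(0, 1) for _ in range(n)]
    examples = sample_examples(secret, m=300, noise_rate=0.2)
    print("recovered:", brute_force_learn(examples, n) == secret)
```

The toy learner succeeds because, when the noise rate is below 1/2, the true function agrees with more examples than any other linear function does with high probability; the challenge the paper addresses is doing this efficiently, under an arbitrary input distribution, and against adversarially placed noise.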