Computational randomness from generalized hardcore sets
FCT'11 Proceedings of the 18th international conference on Fundamentals of computation theory
We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution $(f({\mathcal{X}})\,|\,{\mathcal{X}})$, for some function $f$ and some distribution ${\mathcal{X}}$; we say that such a source has computational min-entropy $k$ if any circuit of size $2^k$ can predict $f(x)$ correctly with probability at most $2^{-k}$ given an input $x$ sampled from ${\mathcal{X}}$. We first show that no seedless extractor can extract from a single source of this kind. We then show that extraction becomes possible if we are allowed a seed that is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed that is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multi-source extractors from the traditional statistical setting to a computational setting. We reduce the task of constructing such extractors to a problem in learning theory: learning linear functions under arbitrary distributions with adversarial noise. For this problem, we provide a learning algorithm, which may be of independent interest.
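To make the learning problem concrete, the following is a minimal sketch (not the paper's algorithm) of learning a linear function over GF(2) from samples with adversarially flipped labels: for small dimension $n$, brute-force search over all $2^n$ candidate vectors and return the one agreeing with the most samples. All names here (`brute_force_learn`, the planted secret `a_star`, the 20% corruption rate) are illustrative assumptions, not from the source.

```python
import itertools
import random

def dot(a, x):
    # Inner product over GF(2).
    return sum(ai & xi for ai, xi in zip(a, x)) % 2

def brute_force_learn(samples, n):
    """Return the vector a in GF(2)^n whose linear function <a, .>
    agrees with the largest number of (x, y) samples.  Exponential in n:
    a baseline illustrating the problem, not an efficient algorithm."""
    best, best_agree = None, -1
    for a in itertools.product([0, 1], repeat=n):
        agree = sum(1 for x, y in samples if dot(a, x) == y)
        if agree > best_agree:
            best, best_agree = a, agree
    return best

# Usage: plant a secret a_star, then corrupt 20% of the labels.
random.seed(0)
n, m = 6, 200
a_star = tuple(random.randint(0, 1) for _ in range(n))
samples = []
for i in range(m):
    x = tuple(random.randint(0, 1) for _ in range(n))
    y = dot(a_star, x)
    if i < m // 5:  # adversarial noise: flip labels on 20% of samples
        y ^= 1
    samples.append((x, y))

recovered = brute_force_learn(samples, n)
```

With a 20% corruption rate, the planted vector still agrees with 80% of the samples, while any other candidate agrees with roughly half of them, so agreement maximization recovers the secret; the paper's contribution is handling this under arbitrary distributions without the exponential search.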