On the Computational Complexity of Approximating Distributions by Probabilistic Automata. Machine Learning (Computational Learning Theory).
Cryptographic primitives based on hard learning problems. CRYPTO '93 (13th Annual International Cryptology Conference, Advances in Cryptology).
Weakly learning DNF and characterizing statistical query learning using Fourier analysis. STOC '94 (26th Annual ACM Symposium on Theory of Computing).
On the learnability of discrete distributions. STOC '94 (26th Annual ACM Symposium on Theory of Computing).
Toward Efficient Agnostic Learning. Machine Learning (special issue on computational learning theory, COLT '92).
Efficient noise-tolerant learning from statistical queries. Journal of the ACM (JACM).
Quantum Finite State Transducers. SOFSEM '01 (28th Conference on Current Trends in Theory and Practice of Informatics, Piešťany).
Noise-tolerant learning, the parity problem, and the statistical query model. Journal of the ACM (JACM).
1-way quantum finite automata: strengths, weaknesses and generalizations. FOCS '98 (39th Annual Symposium on Foundations of Computer Science).
Improved constructions of quantum automata. Theoretical Computer Science.
On Agnostic Learning of Parities, Monomials, and Halfspaces. SIAM Journal on Computing.
We examine the complexity of learning the distributions produced by finite-state quantum sources. We show that prior techniques for learning hidden Markov models can be adapted to the quantum-generator model, and that the analogous state of affairs holds: information-theoretically, a polynomial number of samples suffices to approximately identify the distribution, but computationally the problem is as hard as learning parities with noise, a notorious open problem in computational learning theory.
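To make the hardness benchmark concrete, here is a minimal sketch of the sample oracle for the learning-parities-with-noise (LPN) problem referenced above: examples are uniform bit-vectors labeled by their inner product with a hidden secret modulo 2, with each label flipped independently at a fixed noise rate. The function names and parameters (`lpn_oracle`, `draw_samples`, `noise_rate`) are illustrative, not taken from the paper.

```python
import random

def lpn_oracle(secret, noise_rate, rng):
    """Draw one LPN sample: x is uniform over {0,1}^n, and the label is
    the parity <x, secret> mod 2, flipped with probability noise_rate."""
    x = [rng.randint(0, 1) for _ in range(len(secret))]
    parity = sum(xi * si for xi, si in zip(x, secret)) % 2
    if rng.random() < noise_rate:
        parity ^= 1  # independent label noise
    return x, parity

def draw_samples(secret, m, noise_rate, seed=0):
    """Draw m i.i.d. LPN samples with a seeded RNG for reproducibility."""
    rng = random.Random(seed)
    return [lpn_oracle(secret, noise_rate, rng) for _ in range(m)]
```

With `noise_rate = 0` the secret is recoverable by Gaussian elimination over GF(2); with any constant noise rate, no polynomial-time algorithm is known, which is the sense in which the distribution-learning problem above is "as hard as" LPN.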