Given a set of $n$ $d$-dimensional Boolean vectors with the promise that the vectors are chosen uniformly at random with the exception of two vectors that have Pearson correlation $\rho$ (Hamming distance $d\cdot \frac{1-\rho}{2}$), how quickly can one find the two correlated vectors? We present an algorithm which, for any constants $\eps, \rho > 0$ and $d > \frac{\log n}{\rho^2}$, finds the correlated pair with high probability, and runs in time $O(n^{\frac{3 \omega}{4}+\eps})$, where $\omega < 2.38$ is the exponent of matrix multiplication.

Approximate Closest Pair: For any sufficiently small constant $\eps > 0$, given $n$ vectors in $\R^d$, our algorithm returns a pair of vectors whose Euclidean distance differs from that of the closest pair by a factor of at most $1+\eps$, and runs in time $O(n^{2-\Theta(\sqrt{\eps})})$. The best previous algorithms (including LSH) have runtime $O(n^{2-O(\eps)})$.

Learning Sparse Parity with Noise: Given samples from an instance of the learning parity with noise problem where each example has length $n$, the true parity set has size at most $k$, and the noise rate is $\eta$, our algorithm identifies the set of $k$ indices in time $n^{k(1-\frac{2}{2^k})} \cdot poly(\frac{1}{1-2\eta})$.

Learning $k$-Juntas without Noise: Our results for learning sparse parities with noise imply an algorithm for learning juntas without noise with runtime $n^{\frac{\omega+ \eps}{4} k} \cdot poly(n)$, which improves on the runtime $n^{\frac{\omega}{\omega+1} k} \cdot poly(n) \approx n^{0.7k} \cdot poly(n)$ of Mossel et al. [13].
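To make the problem setup concrete, the following sketch plants a correlated pair among random $\pm 1$ vectors and recovers it by the naive quadratic-time method: compute all pairwise inner products as a single Gram matrix product $XX^T$ and take the largest off-diagonal entry. This illustrates why matrix multiplication is the natural tool here; it is not the paper's subquadratic algorithm, and the parameter values ($n$, $d$, $\rho$) are illustrative choices, not taken from the source.

```python
import numpy as np

def find_correlated_pair(X):
    # X: n x d matrix of +/-1 entries. For uniform +/-1 vectors the
    # inner product of rows i, j is about rho*d for the planted pair
    # (agreement fraction (1+rho)/2), while random pairs concentrate
    # near 0 with standard deviation sqrt(d).
    G = X @ X.T             # all pairwise inner products at once
    np.fill_diagonal(G, 0)  # ignore self-correlations
    i, j = np.unravel_index(np.argmax(G), G.shape)
    return (i, j) if i < j else (j, i)

rng = np.random.default_rng(0)
n, d, rho = 200, 2000, 0.5     # illustrative parameters
X = rng.choice([-1, 1], size=(n, d))
# Plant a pair: vector 7 agrees with vector 3 on a (1+rho)/2
# fraction of coordinates, i.e. Hamming distance ~ d*(1-rho)/2.
mask = rng.random(d) < (1 + rho) / 2
X[7] = np.where(mask, X[3], -X[3])
print(find_correlated_pair(X))
```

The planted pair's inner product concentrates around $\rho d = 1000$ here, far above the $O(\sqrt{d \log n})$ fluctuations of the random pairs, so the argmax isolates it reliably; the cost is the $O(n^2 d)$ (or $n^{\omega}$-type) Gram computation that the paper's algorithm improves upon.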
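For the sparse parity problem, the trivial baseline the improved runtimes are measured against is $n^k$-time brute force: enumerate every size-$k$ index set and keep the one whose parity best matches the noisy labels. The sketch below shows that baseline only (not the paper's algorithm); the planted index set and parameter values are hypothetical.

```python
import numpy as np
from itertools import combinations

def brute_force_sparse_parity(examples, labels, k):
    # Trivial O(n^k)-time baseline: for noise rate eta < 1/2 the true
    # parity agrees with the labels on ~(1-eta) of the samples, while
    # any other parity agrees on ~1/2 of them.
    n = examples.shape[1]
    best, best_agree = None, -1
    for S in combinations(range(n), k):
        pred = np.bitwise_xor.reduce(examples[:, list(S)], axis=1)
        agree = int(np.sum(pred == labels))
        if agree > best_agree:
            best, best_agree = S, agree
    return best

rng = np.random.default_rng(1)
m, n, k, eta = 500, 12, 2, 0.1   # illustrative parameters
X = rng.integers(0, 2, size=(m, n))
secret = [4, 9]                  # hypothetical planted parity set
y = np.bitwise_xor.reduce(X[:, secret], axis=1)
y = y ^ (rng.random(m) < eta).astype(y.dtype)  # flip each label w.p. eta
print(brute_force_sparse_parity(X, y, k))
```

With $m$ samples, the true set's agreement count separates from the rest by roughly $(\frac{1}{2}-\eta)m$ versus $O(\sqrt{m \log n^k})$ fluctuations, which is why $poly(\frac{1}{1-2\eta})$ sample factors suffice; the exponential cost is entirely in enumerating the $\binom{n}{k}$ candidate sets.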