An O(n^{log log n}) learning algorithm for DNF under the uniform distribution
Journal of Computer and System Sciences
The average sensitivity of bounded-depth circuits
Information Processing Letters
Every decision tree has an influential variable
FOCS '05 Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science
Agnostically learning decision trees
STOC '08 Proceedings of the fortieth annual ACM symposium on Theory of computing
Some topics in analysis of boolean functions
STOC '08 Proceedings of the fortieth annual ACM symposium on Theory of computing
The influence of variables on Boolean functions
SFCS '88 Proceedings of the 29th Annual Symposium on Foundations of Computer Science
Improved pseudorandom generators for depth 2 circuits
APPROX/RANDOM'10 Proceedings of the 13th International Conference on Approximation and the 14th International Conference on Randomization, and Combinatorial Optimization: Algorithms and Techniques
A composition theorem for the Fourier Entropy-Influence conjecture
ICALP'13 Proceedings of the 40th International Conference on Automata, Languages, and Programming - Part I
Decision trees, protocols and the entropy-influence conjecture
Proceedings of the 5th conference on Innovations in theoretical computer science
In 1996, Friedgut and Kalai made the Fourier Entropy-Influence Conjecture: for every Boolean function f : {-1, 1}^n → {-1, 1} it holds that H[f^2] ≤ C · I[f], where H[f^2] is the spectral entropy of f, I[f] is the total influence of f, and C is a universal constant. In this work we verify the conjecture for symmetric functions. More generally, we verify it for functions with symmetry group S_{n_1} × ... × S_{n_d}, where d is constant. We also verify the conjecture for functions computable by read-once decision trees.
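The two quantities in the conjecture can be computed directly for small n by brute-force Fourier expansion. The sketch below (the choice of majority on 3 bits as the example, and all function names, are illustrative assumptions, not taken from the paper) evaluates H[f^2] = Σ_S f̂(S)^2 log(1/f̂(S)^2) and I[f] = Σ_S |S| f̂(S)^2 and lets one check the inequality H[f^2] ≤ C · I[f] on a symmetric function:

```python
import itertools
import math

def fourier_coefficients(f, n):
    """Brute-force Fourier expansion of f: {-1,1}^n -> {-1,1}.

    Returns a dict mapping each subset S (as a tuple of indices)
    to f_hat(S) = E_x[ f(x) * prod_{i in S} x_i ].
    """
    points = list(itertools.product([-1, 1], repeat=n))
    coeffs = {}
    for k in range(n + 1):
        for S in itertools.combinations(range(n), k):
            total = sum(f(x) * math.prod(x[i] for i in S) for x in points)
            coeffs[S] = total / len(points)
    return coeffs

def spectral_entropy(coeffs):
    # H[f^2] = sum_S f_hat(S)^2 * log2(1 / f_hat(S)^2), skipping zero terms
    return sum(c * c * math.log2(1 / (c * c))
               for c in coeffs.values() if c != 0)

def total_influence(coeffs):
    # I[f] = sum_S |S| * f_hat(S)^2
    return sum(len(S) * c * c for S, c in coeffs.items())

# Example: majority on 3 bits, a symmetric function.
maj3 = lambda x: 1 if sum(x) > 0 else -1
coeffs = fourier_coefficients(maj3, 3)
H = spectral_entropy(coeffs)   # 2.0 for maj3
I = total_influence(coeffs)    # 1.5 for maj3
```

For maj3 the nonzero coefficients are f̂({i}) = 1/2 for each singleton and f̂({0,1,2}) = -1/2, giving H = 2 and I = 3/2, so the conjectured inequality holds here with C = 4/3.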