On the concentration of multivariate polynomials with small expectation. Random Structures & Algorithms.
Concentration of non-Lipschitz functions and applications. Random Structures & Algorithms — Probabilistic Methods in Combinatorial Optimization.
Convex Optimization.
DOULION: counting triangles in massive graphs with a coin. Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
Concentration of Measure for the Analysis of Randomized Algorithms.
Pseudorandom generators for polynomial threshold functions. Proceedings of the 42nd ACM Symposium on Theory of Computing.
Bounding the average sensitivity and noise sensitivity of polynomial threshold functions. Proceedings of the 42nd ACM Symposium on Theory of Computing.
The Gaussian surface area and noise sensitivity of degree-d polynomial threshold functions. Proceedings of the 2010 IEEE 25th Annual Conference on Computational Complexity.
A regularity lemma, and low-weight approximators, for low-degree polynomial threshold functions. Proceedings of the 2010 IEEE 25th Annual Conference on Computational Complexity.
Correlation clustering with noisy input. Proceedings of the 21st Annual ACM-SIAM Symposium on Discrete Algorithms.
The computational complexity of linear optics. Proceedings of the 43rd Annual ACM Symposium on Theory of Computing.
Bounded independence fools halfspaces. SIAM Journal on Computing.
In this work we design a general method for proving moment inequalities for polynomials of independent random variables. The method works for a wide range of random variables, including Gaussian, Boolean, exponential, Poisson, and many others. We apply it to derive general concentration inequalities for polynomials of independent random variables, and we show that it yields concentration inequalities for some previously open problems, e.g., the permanent of a random symmetric matrix. We show that our concentration inequality is stronger than the well-known concentration inequality due to Kim and Vu [29]. The main advantages of our method over existing ones are the wide range of random variables it can handle and the bounds it gives in previously intractable regimes: high-degree polynomials and small expectations. On the negative side, we show that, even for Boolean random variables, each term in our concentration inequality is tight.
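As an illustrative aside (not the paper's method), the phenomenon the abstract describes — a low-degree polynomial of independent random variables concentrating around its expectation — can be observed empirically. The sketch below evaluates a degree-2 polynomial of i.i.d. Bernoulli(p) variables over many trials and reports the average absolute deviation from the exact expectation; all function names and parameters here are our own, chosen purely for illustration.

```python
import random

def polynomial_value(x):
    """Degree-2 polynomial of the inputs: sum of adjacent products x_i * x_{i+1}."""
    return sum(x[i] * x[i + 1] for i in range(len(x) - 1))

def empirical_concentration(n=200, p=0.5, trials=2000, seed=0):
    """Sample Bernoulli(p) vectors and measure how far the polynomial
    strays, on average, from its exact expectation (n-1) * p^2."""
    rng = random.Random(seed)
    mean_true = (n - 1) * p * p  # E[x_i * x_{i+1}] = p^2 for each of the n-1 terms
    total_dev = 0.0
    for _ in range(trials):
        x = [1 if rng.random() < p else 0 for _ in range(n)]
        total_dev += abs(polynomial_value(x) - mean_true)
    return mean_true, total_dev / trials
```

With n = 200 and p = 0.5 the expectation is 49.75, and the average deviation observed in simulation is an order of magnitude smaller — the qualitative behavior that the concentration inequalities in the paper bound quantitatively.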