A random polynomial-time algorithm for approximating the volume of convex bodies. Journal of the ACM (JACM).
Sampling and integration of near log-concave functions. STOC '91: Proceedings of the Twenty-Third Annual ACM Symposium on Theory of Computing; SIAM Journal on Computing.
Fast Algorithms for Logconcave Functions: Sampling, Rounding, Integration and Optimization. FOCS '06: Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science.
The geometry of logconcave functions and sampling algorithms. Random Structures & Algorithms.
Sampling s-Concave Functions: The Limit of Convexity Based Isoperimetry. APPROX/RANDOM '09: Proceedings of the 12th and 13th International Workshops on Approximation, Randomization, and Combinatorial Optimization.
Thin partitions: isoperimetric inequalities and a sampling algorithm for star shaped bodies. SODA '10: Proceedings of the Twenty-First Annual ACM-SIAM Symposium on Discrete Algorithms.
Let X1, X2, ..., Xn be a set of random variables. Suppose that, in addition to the prior distributions of these random variables, we are also given linear constraints relating them. We ask for necessary and sufficient conditions under which we can efficiently sample from the constrained distributions, compute the constrained marginal distribution of each random variable, and so on, and we give a tight characterization of when this is possible. The problem is motivated by scenarios in which we have separate probabilistic inferences in some domain, but domain knowledge allows us to relate these inferences. When the joint prior distribution is a product distribution, the linear constraints must be chosen carefully and are crucial in constructing the lower-bound instances; no such constraints are necessary if arbitrary priors are allowed.
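The setting in the abstract can be illustrated on a toy instance: independent discrete priors conditioned on a single linear constraint, with the constrained marginals computed by brute-force enumeration. This is only an illustrative sketch of the problem statement, not the paper's algorithm; the specific priors and constraint below are hypothetical.

```python
import itertools
from fractions import Fraction

# Hypothetical toy instance (not from the paper): X1, X2 independent,
# each uniform on {0,...,5}, linked by the linear constraint X1 + X2 == 5.
support = range(6)
prior = {v: Fraction(1, 6) for v in support}

# Enumerate the joint product prior, keep only assignments satisfying
# the linear constraint, and renormalize to get the constrained joint.
joint = {}
total = Fraction(0)
for x1, x2 in itertools.product(support, repeat=2):
    if x1 + x2 == 5:                      # the linear constraint
        p = prior[x1] * prior[x2]
        joint[(x1, x2)] = p
        total += p
constrained = {a: p / total for a, p in joint.items()}

# Constrained marginal of X1: sum the constrained joint over X2.
marginal_x1 = {v: sum(p for (x1, _), p in constrained.items() if x1 == v)
               for v in support}
```

Enumeration is exponential in the number of variables, which is exactly why the question of when constrained sampling and marginalization can be done efficiently is nontrivial.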