Algorithms for random generation and counting: a Markov chain approach
Independent sets versus perfect matchings
Theoretical Computer Science
Randomized algorithms
A computational view of population genetics
STOC '95 Proceedings of the twenty-seventh annual ACM symposium on Theory of computing
The Markov chain Monte Carlo method: an approach to approximate counting and integration
Approximation algorithms for NP-hard problems
SFCS '92 Proceedings of the 33rd Annual Symposium on Foundations of Computer Science
Markov chains and polynomial time algorithms
SFCS '94 Proceedings of the 35th Annual Symposium on Foundations of Computer Science
In this paper we present a randomized parallel algorithm that samples matchings from an almost uniform distribution over the matchings of all sizes in a graph. First we prove that the direct NC simulation of the sequential Markov chain technique for this problem is P-complete. We then present a randomized parallel algorithm based on the definition of a genetic system that converges to the uniform distribution. The system evolves according to a non-linear equation, and little is known about the convergence of such systems in general; we show that a non-linear system converging to a stationary distribution can be defined under quite natural conditions. We prove convergence for the system corresponding to almost uniform sampling of matchings in a graph (until now, the only known convergence result for non-linear systems on matchings was for matchings in a tree [5]). We give empirical evidence that the system converges quickly, in polylogarithmic parallel time.
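For context, the sequential Markov chain technique the abstract refers to is typically the standard "add/remove/slide" chain on matchings. The sketch below is an assumption about that baseline (it is not the paper's parallel algorithm or its genetic system): each step picks a uniformly random edge and either adds it, removes it, or slides it into the matching, giving a symmetric chain whose stationary distribution is uniform over matchings of all sizes.

```python
import random

def matching_chain_step(edges, matching):
    """One step of the classical add/remove/slide Markov chain on matchings.

    `edges` is a list of canonical vertex pairs; `matching` is a frozenset
    of those pairs. This is an illustrative sketch of the sequential
    baseline, not the paper's parallel algorithm.
    """
    e = random.choice(edges)          # pick an edge uniformly at random
    u, v = e
    M = set(matching)
    matched = {x for (a, b) in M for x in (a, b)}
    if e in M:
        M.remove(e)                   # remove: e is currently in the matching
    elif u not in matched and v not in matched:
        M.add(e)                      # add: both endpoints are free
    else:
        # slide: if exactly one endpoint is covered, swap the covering
        # edge for e; if both endpoints are covered, stay put.
        blocking = [f for f in M if u in f or v in f]
        if len(blocking) == 1:
            M.remove(blocking[0])
            M.add(e)
    return frozenset(M)

def sample_matching(edges, steps=10_000):
    """Run the chain from the empty matching and return the final state
    as an approximate sample from the uniform distribution on matchings."""
    M = frozenset()
    for _ in range(steps):
        M = matching_chain_step(edges, M)
    return M
```

Because each step of this chain updates a single global state, its direct NC simulation is exactly the kind of computation the paper proves P-complete, which motivates replacing it with a population-based (genetic) system.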