Preprocessing (data reduction or kernelization) as a strategy for coping with hard problems is used in almost every practical implementation. The history of preprocessing, such as applying reduction rules to simplify truth functions, can be traced back to the 1950s [6]. A natural question in this regard is how to measure the quality of preprocessing rules proposed for a specific problem. For a long time the mathematical analysis of polynomial-time preprocessing algorithms was neglected. The basic reason for this anomaly was the following: if we start with an instance I of an NP-hard problem and can show that in polynomial time we can replace it with an equivalent instance I′ with |I′| < |I|, then repeating this reduction would solve the problem in polynomial time, and that would imply P = NP in classical complexity.
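To make the notion of a polynomial-time reduction rule concrete, here is a minimal sketch, in Python, of a textbook kernelization: Buss's rules for Vertex Cover. The function name and edge-list representation are our own illustration, not taken from the source; the rules themselves are the classic ones (a vertex of degree greater than k must be in any cover of size at most k, and a fully reduced instance with more than k^2 edges admits no such cover).

```python
def buss_kernel(edges, k):
    """Reduce a Vertex Cover instance (edges, k) to an equivalent one.

    Rule 1: a vertex of degree > k must be in every cover of size <= k,
            so take it into the cover and decrease k.
    Rule 2: isolated vertices are irrelevant (implicit here, since the
            instance is represented as a list of edges only).
    After exhaustive application, an instance with more than k^2 edges
    is a trivial "no". Returns (reduced_edges, k') or None if no cover
    of size <= k exists.
    """
    changed = True
    while changed:
        changed = False
        degree = {}
        for u, v in edges:
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:  # Rule 1 applies: v is forced into the cover
                edges = [e for e in edges if v not in e]
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:  # more than k^2 edges left: reject
        return None
    return edges, k


# Example: a star with 5 leaves and budget k = 2 -- the centre has
# degree 5 > 2, so Rule 1 takes it, leaving no edges and k = 1.
edges = [(0, i) for i in range(1, 6)]
print(buss_kernel(edges, 2))  # -> ([], 1)
```

Note that this routine does not shrink every instance, which is consistent with the P = NP obstacle above; it only guarantees that the surviving instance is small as a function of the parameter k, which is exactly the guarantee kernelization formalizes.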