Kernelization

  • Authors:
  • Fedor V. Fomin

  • Affiliations:
  • Department of Informatics, University of Bergen, Bergen, Norway

  • Venue:
  • CSR'10 Proceedings of the 5th International Conference on Computer Science: Theory and Applications
  • Year:
  • 2010

Abstract

Preprocessing (data reduction or kernelization) as a strategy for coping with hard problems is used in almost every practical implementation. The history of preprocessing, such as applying reduction rules to simplify truth functions, can be traced back to the 1950s [6]. A natural question in this regard is how to measure the quality of preprocessing rules proposed for a specific problem. For a long time the mathematical analysis of polynomial-time preprocessing algorithms was neglected. The basic reason for this anomaly was that if we could show that every instance I of an NP-hard problem can be replaced in polynomial time by an equivalent instance I′ with |I′| < |I|, then that would imply P = NP in classical complexity: repeatedly applying such a reduction would shrink any instance to constant size, at which point it can be solved outright, yielding a polynomial-time algorithm for the NP-hard problem.
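
Parameterized complexity sidesteps the obstacle above by measuring the reduced instance against a parameter k rather than against the input size. As a concrete illustration (a minimal sketch, not taken from the paper; the function name buss_kernel and the graph representation are my own), the following Python code implements Buss's classic kernelization for k-Vertex Cover: any vertex of degree greater than k must belong to every cover of size at most k, and once no such vertex remains, a yes-instance can have at most k^2 edges.

    # Buss's kernelization for k-Vertex Cover (illustrative sketch).
    # Input: an iterable of edges (pairs of hashable vertices) and a budget k.
    # Output: an equivalent smaller instance, or None for a provable no-instance.
    def buss_kernel(edges, k):
        edges = {frozenset(e) for e in edges if len(set(e)) == 2}
        forced = set()          # vertices that must be in every cover of size <= k
        changed = True
        while changed:
            changed = False
            deg = {}
            for e in edges:
                for v in e:
                    deg[v] = deg.get(v, 0) + 1
            for v, d in deg.items():
                # Rule: a vertex of degree > k is in every cover of size <= k.
                if d > k:
                    forced.add(v)
                    edges = {e for e in edges if v not in e}
                    k -= 1
                    changed = True
                    break
            if k < 0:
                return None     # more than k vertices were forced: no-instance
        # Max degree is now <= k, so a cover of size k covers at most k*k edges.
        if len(edges) > k * k:
            return None
        return edges, k, forced

    # Example: a star with 5 leaves and budget 2; the center is forced.
    print(buss_kernel([(0, i) for i in range(1, 6)], 2))
    # -> (set(), 1, {0})

On a yes-instance the returned kernel has at most k^2 edges, so an exact solver applied to it runs in time depending on k alone; it is this parameterized size guarantee, rather than a shrinking of every instance, that makes a rigorous analysis of preprocessing possible.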