2010 | OriginalPaper | Book chapter
Kernelization
(Invited Talk)
Author: Fedor V. Fomin
Published in: Computer Science – Theory and Applications
Publisher: Springer Berlin Heidelberg
Preprocessing (data reduction or kernelization) as a strategy for coping with hard problems is used in almost every practical implementation. The history of preprocessing, such as applying reduction rules to simplify truth functions, can be traced back to the 1950s [6]. A natural question in this regard is how to measure the quality of the preprocessing rules proposed for a specific problem. For a long time, the mathematical analysis of polynomial-time preprocessing algorithms was neglected. The basic reason for this anomaly is the following: if we could take an instance I of an NP-hard problem and show that in polynomial time it can always be replaced by an equivalent instance I′ with |I′| < |I|, then repeating this reduction would shrink any instance to constant size in polynomially many steps, yielding a polynomial-time algorithm for the problem and hence implying P=NP in classical complexity.
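To make the notion of preprocessing with provable guarantees concrete, here is a sketch of a classic kernelization, Buss's reduction for Vertex Cover. The abstract itself does not describe this algorithm; it is a standard textbook example, and all names below are illustrative. The key point is that the guarantee is measured against the parameter k rather than against |I|: after the rules are applied, a YES-instance has at most k² edges.

```python
def buss_kernel(edges, k):
    """Reduce a Vertex Cover instance (edges, k) to an equivalent kernel.

    Rule 1: a vertex of degree > k must belong to any cover of size <= k,
            so it can be taken into the cover and deleted.
    Rule 2: isolated vertices never help (handled implicitly, since we
            keep only edges).
    After exhaustive application, a YES-instance has at most k*k edges.
    Returns (kernel_edges, remaining_k, forced_vertices), or None if the
    instance is provably a NO-instance.
    """
    edges = set(frozenset(e) for e in edges)
    forced = set()          # vertices forced into the cover by Rule 1
    changed = True
    while changed:
        changed = False
        if k < 0:
            return None     # forced more vertices than the budget allows
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if len(edges) > k * k:
        return None         # too many edges: no cover of size k exists
    return edges, k, forced
```

For example, on a star with center 0 and k = 1, Rule 1 forces the center into the cover and the kernel becomes empty; on a triangle with k = 0, the routine correctly reports a NO-instance. This illustrates why kernelization escapes the P=NP obstacle above: the output size is bounded in terms of k, not required to be strictly smaller than the input.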