Confronting intractability via parameters

Rodney G. Downey, Dimitrios M. Thilikos

Research output: Contribution to journal › Review article › Peer-reviewed

Abstract

One approach to confronting computational hardness is to try to understand the contribution of various parameters to the running time of algorithms and the complexity of computational tasks. Almost no computational tasks in real life are specified by their size alone. It is not hard to imagine that some parameters contribute more intractability than others, and it seems reasonable to develop a theory of computational complexity that seeks to exploit this fact. Such a theory should be able to address the needs of practitioners in algorithmics. The last twenty years have seen the development of such a theory, which has produced many successes: a rich collection of algorithmic techniques, both practical and theoretical, and a fine-grained intractability theory. Whilst the theory has been widely used in a number of application areas, including computational biology, linguistics, VLSI design, learning theory, and many others, knowledge of the area is highly varied. We hope that this article will present the basic theory and point to the wide array of techniques available. Naturally the treatment is condensed, and the reader who wants more should go to the texts of Downey and Fellows (1999) [2], Flum and Grohe (2006) [59], Niedermeier (2006) [28], and the upcoming undergraduate text (Downey and Fellows, 2012) [278].
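To make the abstract's central idea concrete, here is a minimal sketch (our illustration, not taken from the article) of the textbook bounded search tree algorithm for k-Vertex Cover, one of the standard examples of a fixed-parameter tractable problem: the running time is roughly O(2^k · poly(n)), so the exponential cost is confined to the parameter k rather than the input size n.

```python
# Illustrative sketch, not from the paper: bounded search tree for k-Vertex Cover.
# The search tree has at most 2^k leaves, and each node costs polynomial time,
# giving the f(k) * poly(n) running time shape that parameterized complexity studies.

def vertex_cover(edges, k):
    """Return True iff the graph given by `edges` has a vertex cover of size <= k."""
    # If no edges remain, the empty set covers everything.
    if not edges:
        return True
    # Edges remain but no budget: this branch fails.
    if k == 0:
        return False
    # Pick any uncovered edge (u, v). Every vertex cover must contain u or v,
    # so branch on both choices, deleting the chosen vertex's incident edges.
    u, v = next(iter(edges))
    edges_without_u = {(a, b) for (a, b) in edges if a != u and b != u}
    edges_without_v = {(a, b) for (a, b) in edges if a != v and b != v}
    return vertex_cover(edges_without_u, k - 1) or vertex_cover(edges_without_v, k - 1)

if __name__ == "__main__":
    # A 5-cycle needs 3 vertices to cover all of its edges.
    cycle = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)}
    print(vertex_cover(cycle, 2))  # False
    print(vertex_cover(cycle, 3))  # True
```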

Original language: English (US)
Pages (from-to): 279-317
Number of pages: 39
Journal: Computer Science Review
Volume: 5
Issue number: 4
DOIs:
State: Published - Nov 2011

Keywords

  • Parameterized algorithms
  • Parameterized complexity

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
