Tightening LP relaxations for MAP using message passing

David Sontag, Talya Meltzer, Amir Globerson, Tommi Jaakkola, Yair Weiss

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Linear Programming (LP) relaxations have become powerful tools for finding the most probable (MAP) configuration in graphical models. These relaxations can be solved efficiently using message-passing algorithms such as belief propagation and, when the relaxation is tight, provably find the MAP configuration. The standard LP relaxation is not tight enough in many real-world problems, however, and this has led to the use of higher-order, cluster-based LP relaxations. The computational cost increases exponentially with the size of the clusters, which limits the number and type of clusters that can be used. We propose to solve the cluster selection problem monotonically in the dual LP, iteratively selecting clusters with guaranteed improvement and quickly re-solving with the added clusters by reusing the existing solution. Our dual message-passing algorithm finds the MAP configuration in protein side-chain placement, protein design, and stereo problems, in cases where the standard LP relaxation fails.
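
A toy sketch of the selection criterion described in the abstract: start from a per-factor upper bound on the MAP value and add the candidate triplet cluster whose inclusion gives the largest guaranteed decrease of that bound. This is not the authors' dual message-passing algorithm, which scores candidate clusters on the reparametrized dual after message passing and warm-starts from the existing messages; the helper names (independent_bound, triplet_gain) and the frustrated-cycle example below are hypothetical, chosen only to illustrate the idea of guaranteed improvement.

import itertools

def independent_bound(node_pot, edge_pot):
    """Upper bound on the MAP value: maximize every node and edge factor independently."""
    bound = sum(max(p.values()) for p in node_pot.values())
    bound += sum(max(p.values()) for p in edge_pot.values())
    return bound

def triplet_gain(trip, edge_pot, states):
    """Guaranteed bound decrease from replacing the three edges of `trip` by one joint cluster."""
    i, j, k = trip
    edges = [(i, j), (j, k), (i, k)]
    if not all(e in edge_pot for e in edges):
        return 0.0
    separate = sum(max(edge_pot[e].values()) for e in edges)
    joint = max(edge_pot[(i, j)][xi, xj]
                + edge_pot[(j, k)][xj, xk]
                + edge_pot[(i, k)][xi, xk]
                for xi in states for xj in states for xk in states)
    return separate - joint  # never negative: the joint max is at most the sum of the separate maxes

# Tiny frustrated cycle on three binary variables: two edges reward agreement,
# one rewards disagreement, so the edge-by-edge bound (3.0) overshoots the true MAP value (2.0).
states = (0, 1)
node_pot = {v: {0: 0.0, 1: 0.0} for v in range(3)}
agree = {(a, b): 1.0 if a == b else 0.0 for a in states for b in states}
disagree = {(a, b): 0.0 if a == b else 1.0 for a in states for b in states}
edge_pot = {(0, 1): dict(agree), (1, 2): dict(agree), (0, 2): dict(disagree)}

print("initial bound:", independent_bound(node_pot, edge_pot))   # 3.0
best = max(itertools.combinations(range(3), 3),
           key=lambda t: triplet_gain(t, edge_pot, states))
print("best triplet:", best, "guaranteed gain:", triplet_gain(best, edge_pot, states))  # gain 1.0

In this toy case a single triplet cluster closes the entire gap, bringing the bound down to the true MAP value of 2.0; the paper's contribution is to do this selection and re-solving inside the dual LP so that each added cluster monotonically improves the bound at low incremental cost.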

Original language: English (US)
Title of host publication: Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, UAI 2008
Pages: 503-510
Number of pages: 8
State: Published - 2008
Event: 24th Conference on Uncertainty in Artificial Intelligence, UAI 2008 - Helsinki, Finland
Duration: Jul 9 2008 - Jul 12 2008

Publication series

Name: Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, UAI 2008

Other

Other: 24th Conference on Uncertainty in Artificial Intelligence, UAI 2008
Country/Territory: Finland
City: Helsinki
Period: 7/9/08 - 7/12/08

ASJC Scopus subject areas

  • Artificial Intelligence
