TY - GEN

T1 - Fixed points of generalized approximate message passing with arbitrary matrices

AU - Rangan, Sundeep

AU - Schniter, Philip

AU - Riegler, Erwin

AU - Fletcher, Alyson

AU - Cevher, Volkan

PY - 2013

Y1 - 2013

N2 - The estimation of a random vector with independent components passed through a linear transform followed by a componentwise (possibly nonlinear) output map arises in a range of applications. Approximate message passing (AMP) methods, based on Gaussian approximations of loopy belief propagation, have recently attracted considerable attention for such problems. For large random transforms, these methods exhibit fast convergence and admit precise analytic characterizations with testable conditions for optimality, even for certain non-convex problem instances. However, the behavior of AMP under general transforms is not fully understood. In this paper, we consider the generalized AMP (GAMP) algorithm and relate the method to more common optimization techniques. This analysis enables a precise characterization of the fixed points of the GAMP algorithm that applies to arbitrary transforms. In particular, we show that the fixed points of the so-called max-sum GAMP algorithm for MAP estimation are critical points of a constrained maximization of the posterior density. The fixed points of the sum-product GAMP algorithm for estimation of the posterior marginals can be interpreted as critical points of a certain mean-field variational optimization.

KW - ADMM

KW - belief propagation

KW - message passing

KW - variational optimization

UR - http://www.scopus.com/inward/record.url?scp=84890407253&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84890407253&partnerID=8YFLogxK

U2 - 10.1109/ISIT.2013.6620309

DO - 10.1109/ISIT.2013.6620309

M3 - Conference contribution

AN - SCOPUS:84890407253

SN - 9781479904464

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 664

EP - 668

BT - 2013 IEEE International Symposium on Information Theory, ISIT 2013

T2 - 2013 IEEE International Symposium on Information Theory, ISIT 2013

Y2 - 7 July 2013 through 12 July 2013

ER -