Vector Approximate Message Passing

Sundeep Rangan, Philip Schniter, Alyson K. Fletcher

Research output: Contribution to journal › Article › peer-review


The standard linear regression (SLR) problem is to recover a vector x^0 from noisy linear observations y = Ax^0 + w. The approximate message passing (AMP) algorithm proposed by Donoho, Maleki, and Montanari is a computationally efficient iterative approach to SLR that has a remarkable property: for large i.i.d. sub-Gaussian matrices A, its per-iteration behavior is rigorously characterized by a scalar state evolution whose fixed points, when unique, are Bayes optimal. The AMP algorithm, however, is fragile in that even small deviations from the i.i.d. sub-Gaussian model can cause the algorithm to diverge. This paper considers a 'vector AMP' (VAMP) algorithm and shows that VAMP has a rigorous scalar state evolution that holds under a much broader class of large random matrices A: those that are right-orthogonally invariant. After performing an initial singular value decomposition (SVD) of A, the per-iteration complexity of VAMP is similar to that of AMP. In addition, the fixed points of VAMP's state evolution are consistent with the replica prediction of the minimum mean-squared error derived by Tulino, Caire, Verdú, and Shamai. Numerical experiments are used to confirm the effectiveness of VAMP and its consistency with state-evolution predictions.
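For intuition, the two-stage structure the abstract describes — a separable denoising step alternating with an LMMSE step, with the LMMSE step accelerated by a one-time SVD of A — can be sketched in Python. This is an illustrative sketch only, not the paper's reference implementation: the soft-threshold denoiser (standing in for the prior's MMSE denoiser), the threshold rule tau = 1/sqrt(gamma1), the clipping of the divergence terms, and all problem parameters are assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SLR instance y = A x0 + w with a sparse x0 (all parameters are assumptions)
M, N, K = 256, 512, 32
x0 = np.zeros(N)
x0[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
A = rng.standard_normal((M, N)) / np.sqrt(M)
gamma_w = 1e4                               # noise precision, assumed known
y = A @ x0 + rng.standard_normal(M) / np.sqrt(gamma_w)

# One-time SVD: afterwards each LMMSE step costs O(MN), as the abstract notes.
U, s, Vt = np.linalg.svd(A, full_matrices=False)    # A = U diag(s) Vt
Uty = U.T @ y

def lmmse(r2, gamma2):
    """(gamma_w A^T A + gamma2 I)^{-1} (gamma_w A^T y + gamma2 r2), via the SVD."""
    d = 1.0 / (gamma_w * s**2 + gamma2)             # per-mode shrinkage factors
    Vtr = Vt @ r2
    x2 = Vt.T @ (d * (gamma_w * s * Uty + gamma2 * Vtr)) + (r2 - Vt.T @ Vtr)
    alpha2 = gamma2 * (np.sum(d) + (N - len(s)) / gamma2) / N   # = gamma2 tr(.)/N
    return x2, alpha2

def soft(r, tau):                                   # soft-threshold denoiser
    return np.sign(r) * np.maximum(np.abs(r) - tau, 0.0)

clip = lambda a: np.clip(a, 0.01, 0.99)             # keep precisions positive

r1, gamma1 = np.zeros(N), 1e-2
for _ in range(30):
    # Denoising half: treat r1 as x0 plus white noise of precision gamma1
    tau = 1.0 / np.sqrt(gamma1)                     # heuristic threshold (assumed)
    x1 = soft(r1, tau)
    alpha1 = clip(np.mean(np.abs(r1) > tau))        # average denoiser divergence
    gamma2 = gamma1 * (1.0 / alpha1 - 1.0)
    r2 = (x1 / alpha1 - r1) * gamma1 / gamma2       # extrinsic mean passed onward
    # LMMSE half: combine measurements with the pseudo-prior N(r2, I/gamma2)
    x2, alpha2 = lmmse(r2, gamma2)
    alpha2 = clip(alpha2)
    gamma1 = gamma2 * (1.0 / alpha2 - 1.0)
    r1 = (x2 / alpha2 - r2) * gamma2 / gamma1       # extrinsic mean passed back
```

The SVD route exploits that gamma_w A^T A + gamma2 I is diagonalized by the right singular vectors of A, so each LMMSE solve reduces to elementwise scaling rather than a fresh matrix inversion.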

Original language: English (US)
Article number: 8713501
Pages (from-to): 6664-6684
Number of pages: 21
Journal: IEEE Transactions on Information Theory
Issue number: 10
State: Published - Oct 2019


Keywords

  • Belief propagation
  • compressive sensing
  • inference algorithms
  • message passing
  • random matrices

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


