TY - GEN
T1 - Vector approximate message passing for the generalized linear model
AU - Schniter, Philip
AU - Rangan, Sundeep
AU - Fletcher, Alyson K.
N1 - Funding Information:
Schniter acknowledges support from NSF grant 1527162; Rangan from NSF grants 1302336, 1564142, and 1547332; and Fletcher from NSF grants 1254204 and 1564278 as well as ONR grant N00014-15-1-2677.
Publisher Copyright:
© 2016 IEEE.
PY - 2017/3/1
Y1 - 2017/3/1
N2 - The generalized linear model (GLM), where a random vector x is observed through a noisy, possibly nonlinear, function of a linear transform output z = Ax, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When A is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general A, though, GAMP can misbehave. Damping and sequential updating help to robustify GAMP, but their effects are limited. Recently, a 'vector AMP' (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian A to the larger class of rotationally invariant A. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in A than damped GAMP.
AB - The generalized linear model (GLM), where a random vector x is observed through a noisy, possibly nonlinear, function of a linear transform output z = Ax, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When A is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general A, though, GAMP can misbehave. Damping and sequential updating help to robustify GAMP, but their effects are limited. Recently, a 'vector AMP' (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian A to the larger class of rotationally invariant A. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in A than damped GAMP.
UR - http://www.scopus.com/inward/record.url?scp=85016260654&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85016260654&partnerID=8YFLogxK
U2 - 10.1109/ACSSC.2016.7869633
DO - 10.1109/ACSSC.2016.7869633
M3 - Conference contribution
AN - SCOPUS:85016260654
T3 - Conference Record - Asilomar Conference on Signals, Systems and Computers
SP - 1525
EP - 1529
BT - Conference Record of the 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016
A2 - Matthews, Michael B.
PB - IEEE Computer Society
T2 - 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016
Y2 - 6 November 2016 through 9 November 2016
ER -