TY - JOUR
T1 - A computational framework for infinite-dimensional Bayesian inverse problems, part II
T2 - Stochastic Newton MCMC with application to ice sheet flow inverse problems
AU - Petra, Noemi
AU - Martin, James
AU - Stadler, Georg
AU - Ghattas, Omar
N1 - Publisher Copyright:
© 2014 Society for Industrial and Applied Mathematics.
PY - 2014
Y1 - 2014
N2 - We address the numerical solution of infinite-dimensional inverse problems in the framework of Bayesian inference. In Part I of this paper [T. Bui-Thanh, O. Ghattas, J. Martin, and G. Stadler, SIAM J. Sci. Comput., 35 (2013), pp. A2494-A2523] we considered the linearized infinite-dimensional inverse problem. In Part II, we relax the linearization assumption and consider the fully nonlinear infinite-dimensional inverse problem using a Markov chain Monte Carlo (MCMC) sampling method. To address the challenges of sampling high-dimensional probability density functions (pdfs) arising upon discretization of Bayesian inverse problems governed by PDEs, we build upon the stochastic Newton MCMC method. This method exploits problem structure by taking as a proposal density a local Gaussian approximation of the posterior pdf, whose covariance operator is given by the inverse of the local Hessian of the negative log posterior pdf. The construction of the covariance is made tractable by invoking a low-rank approximation of the data misfit component of the Hessian. Here we introduce an approximation of the stochastic Newton proposal in which we compute the low-rank-based Hessian at just the maximum a posteriori (MAP) point, and then reuse this Hessian at each MCMC step. We compare the performance of the proposed method to the original stochastic Newton MCMC method and to an independence sampler. The comparison of the three methods is conducted on a synthetic ice sheet inverse problem. For this problem, the stochastic Newton MCMC method with a MAP-based Hessian converges at least as rapidly as the original stochastic Newton MCMC method, but is far cheaper since it avoids recomputing the Hessian at each step. On the other hand, it is more expensive per sample than the independence sampler; however, its convergence is significantly more rapid, and thus overall it is much cheaper. Finally, we present extensive analysis and interpretation of the posterior distribution and classify directions in parameter space based on the extent to which they are informed by the prior or the observations.
KW - Bayesian inference
KW - Ice sheet dynamics
KW - Infinite-dimensional inverse problems
KW - Low-rank approximation
KW - MCMC
KW - Stochastic Newton
KW - Uncertainty quantification
UR - http://www.scopus.com/inward/record.url?scp=84987755534&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84987755534&partnerID=8YFLogxK
U2 - 10.1137/130934805
DO - 10.1137/130934805
M3 - Article
AN - SCOPUS:84987755534
SN - 1064-8275
VL - 36
SP - A1525
EP - A1555
JO - SIAM Journal on Scientific Computing
JF - SIAM Journal on Scientific Computing
IS - 4
ER -