SENSITIVITY ANALYSIS OF THE INFORMATION GAIN IN INFINITE-DIMENSIONAL BAYESIAN LINEAR INVERSE PROBLEMS

Abhijit Chowdhary, Shanyin Tong, Georg Stadler, Alen Alexanderian

Research output: Contribution to journal › Article › peer-review

Abstract

We study the sensitivity of infinite-dimensional Bayesian linear inverse problems governed by partial differential equations (PDEs) with respect to modeling uncertainties. In particular, we consider derivative-based sensitivity analysis of the information gain, as measured by the Kullback–Leibler divergence from the posterior to the prior distribution. To facilitate this, we develop a fast and accurate method for computing derivatives of the information gain with respect to auxiliary model parameters. Our approach combines low-rank approximations, adjoint-based eigenvalue sensitivity analysis, and post-optimal sensitivity analysis. The proposed approach also paves the way for global sensitivity analysis by computing derivative-based global sensitivity measures. We illustrate different aspects of the proposed approach using an inverse problem governed by a scalar linear elliptic PDE and an inverse problem governed by the three-dimensional equations of linear elasticity, the latter motivated by inversion for the fault-slip field after an earthquake.
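
For context, in the linear-Gaussian setting the information gain has a well-known closed form; the following is a minimal sketch of that standard result, with notation (m_pr, C_pr, lambda_i) assumed here and not necessarily matching the paper's. With Gaussian prior N(m_pr, C_pr), posterior N(m_post, C_post), and lambda_i the eigenvalues of the prior-preconditioned data-misfit Hessian,

\[
  D_{\mathrm{KL}}\big(\mu_{\mathrm{post}} \,\|\, \mu_{\mathrm{pr}}\big)
  = \frac{1}{2}\left[\sum_{i}\left(\ln(1+\lambda_i) - \frac{\lambda_i}{1+\lambda_i}\right)
  + \big\|m_{\mathrm{post}} - m_{\mathrm{pr}}\big\|_{\mathcal{C}_{\mathrm{pr}}^{-1}}^{2}\right].
\]

Since the lambda_i typically decay rapidly for PDE-governed problems, the sum can be truncated at the dominant eigenvalues, which is where low-rank approximation and eigenvalue sensitivity analysis naturally enter.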

Original language: English (US)
Pages (from-to): 17-35
Number of pages: 19
Journal: International Journal for Uncertainty Quantification
Volume: 14
Issue number: 6
State: Published - 2024

Keywords

  • Bayesian inverse problems
  • Kullback-Leibler divergence
  • adjoint-based gradient computation
  • numerical PDE
  • uncertainty quantification

ASJC Scopus subject areas

  • Statistics and Probability
  • Modeling and Simulation
  • Discrete Mathematics and Combinatorics
  • Control and Optimization
