In 1931/32, Erwin Schrödinger posed, and to a large extent solved, the problem of finding the most likely random evolution between two continuous probability distributions. This article considers the same problem in the case when only samples of the two distributions are available. A novel iterative procedure is proposed, inspired by Fortet–IPF–Sinkhorn type algorithms. Since only samples of the marginals are available, the new approach replaces the nonlinear boundary couplings with constrained maximum likelihood estimation, and uses importance sampling to propagate the functions ϕ and ϕ̂ solving the Schrödinger system. This method mitigates the curse of dimensionality compared to grid-based discretizations, which become numerically infeasible in high dimensions. The methodology is illustrated in two applications: entropic interpolation of two-dimensional Gaussian mixtures, and the estimation of integrals through a variation of importance sampling.
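The abstract refers to the Fortet–IPF–Sinkhorn family of alternating-projection algorithms. As a rough illustration only (not the paper's sample-based method, which replaces the marginal fitting steps with constrained maximum likelihood), a minimal discrete Sinkhorn iteration coupling two point clouds under an entropic penalty might be sketched as follows; all parameter values and names here are illustrative choices:

```python
import numpy as np

def sinkhorn(x, y, eps=1.0, iters=200):
    """Entropic coupling of two empirical sample sets via Sinkhorn scaling."""
    # Quadratic cost between every pair of samples, then the Gibbs kernel
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / eps)
    m, n = K.shape
    a = np.full(m, 1.0 / m)  # uniform empirical marginal on x-samples
    b = np.full(n, 1.0 / n)  # uniform empirical marginal on y-samples
    u, v = np.ones(m), np.ones(n)
    for _ in range(iters):
        u = a / (K @ v)      # scale rows to match the first marginal
        v = b / (K.T @ u)    # scale columns to match the second marginal
    # Coupling matrix whose marginals approximate a and b
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(50, 2))  # samples of the initial distribution
y = rng.normal(1.0, 1.0, size=(60, 2))  # samples of the final distribution
P = sinkhorn(x, y)
print(P.shape)  # one coupling weight per sample pair
```

The alternating row/column scalings are the discrete analogue of the fixed-point iteration on the Schrödinger system; the paper's contribution is to carry out comparable updates when the marginals are known only through samples rather than on a grid.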