An Eulerian approach to the regularized JKO scheme with low-rank tensor decompositions for Bayesian inversion


Volume

3143

Series Title

WIAS Preprints

Publisher

Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik

Abstract

We study the possibility of using an Eulerian discretization for the problem of modelling and sampling from high-dimensional distributions. The problem is posed as a minimization over the space of probability measures with respect to the Wasserstein distance and solved with the entropy-regularized JKO scheme. Each proximal step can be formulated as a fixed-point equation and solved with accelerated methods such as Anderson acceleration. The use of the low-rank Tensor Train format makes it possible to overcome the curse of dimensionality, i.e. the exponential growth of the number of degrees of freedom with the dimension, inherent to Eulerian approaches. The resulting method requires only pointwise evaluations of the unnormalized posterior and is, in particular, gradient-free. The fixed Eulerian grid permits a caching strategy that significantly reduces the number of expensive posterior evaluations. Once the Eulerian model of the target distribution has been fitted, one can also pass back to the Lagrangian perspective and approximately sample from the distribution. We test our method on both synthetic target distributions and particular Bayesian inverse problems, and report performance comparable to or better than the baseline Metropolis-Hastings MCMC with the same amount of resources. Finally, the fitted model can be modified to facilitate the solution of certain associated problems, which we demonstrate by fitting an importance distribution for a particular quantity of interest. We release our code at https://github.com/viviaxenov/rJKOtt.
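The abstract mentions that each proximal step reduces to a fixed-point equation solved with accelerated methods such as Anderson acceleration. As a rough illustration of that building block (not the paper's implementation — the function name, window size `m`, and the toy problem below are illustrative), a generic windowed Anderson acceleration for x = g(x) can be sketched as:

```python
import numpy as np

def anderson_fixed_point(g, x0, m=5, tol=1e-10, max_iter=200):
    """Anderson acceleration for the fixed-point equation x = g(x).

    g  : map R^n -> R^n
    x0 : initial guess
    m  : history depth (window size)
    """
    x = np.asarray(x0, dtype=float)
    Gs, Fs = [], []                      # histories of g(x_k) and residuals
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x                       # fixed-point residual
        if np.linalg.norm(f) < tol:
            return gx
        Gs.append(gx)
        Fs.append(f)
        if len(Fs) > m + 1:              # keep a sliding window of iterates
            Gs.pop(0)
            Fs.pop(0)
        if len(Fs) == 1:
            x = gx                       # plain Picard step to start
        else:
            # Least-squares mixing of the residual differences
            dF = np.stack([Fs[i + 1] - Fs[i] for i in range(len(Fs) - 1)], axis=1)
            dG = np.stack([Gs[i + 1] - Gs[i] for i in range(len(Gs) - 1)], axis=1)
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = gx - dG @ gamma          # accelerated update
    return x

# Toy usage: the classic fixed point of x = cos(x)
root = anderson_fixed_point(np.cos, np.array([1.0]))
```

In the method described above, `g` would be the (expensive) proximal fixed-point map on the Eulerian grid, which is where the caching of posterior evaluations pays off.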

Publication Type

Report

Version

publishedVersion

License

CC BY 4.0 International