Deep importance sampling using tensor trains with application to a priori and a posteriori rare events

Bibliographic Details
Main Authors: Cui, Tiangang (Author), Dolgov, Sergey (Author), Scheichl, Robert (Author)
Format: Article (Journal)
Language: English
Published: Feb 2024
In: SIAM Journal on Scientific Computing
Year: 2024, Volume: 46, Issue: 1, Pages: C1-C29
ISSN: 1095-7197
DOI: 10.1137/23M1546981
Online Access: Publisher, license required, full text: https://doi.org/10.1137/23M1546981
Publisher, license required, full text: https://epubs.siam.org/doi/10.1137/23M1546981
Description
Summary: Constraints are a natural choice for prior information in Bayesian inference. In various applications, the parameters of interest lie on the boundary of the constraint set. In this paper, we use a method that implicitly defines a constrained prior such that the posterior assigns positive probability to the boundary of the constraint set. We show that by projecting posterior mass onto a polyhedral constraint set, we obtain a new posterior with a rich probabilistic structure on the boundary of that set. If the original posterior is Gaussian, such a projection can be computed efficiently. We apply the method to Bayesian linear inverse problems, in which case samples can be obtained by repeatedly solving constrained least squares problems, similar to computing a MAP estimate but with perturbations in the data. When the method is combined with a Bayesian hierarchical model and the constraint set is a polyhedral cone, we derive a Gibbs sampler to sample efficiently from the hierarchical model. To show the effect of projecting the posterior, we apply the method to deblurring and CT examples.
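The sampling step described in the summary (repeatedly solving a constrained least squares problem with perturbed data) can be sketched as follows. This is a minimal illustrative sketch only, not the paper's implementation: the toy blur operator, noise level `sigma`, prior scale `delta`, and the nonnegativity cone `x >= 0` as the constraint set are all assumptions chosen for demonstration.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)

# Toy 1-D deblurring-style linear inverse problem (illustrative assumption):
n = 20
A = np.exp(-0.5 * (np.subtract.outer(np.arange(n), np.arange(n)) / 2.0) ** 2)
x_true = np.maximum(0.0, np.sin(np.linspace(0.0, np.pi, n)))
sigma = 0.05                              # assumed noise standard deviation
y = A @ x_true + sigma * rng.standard_normal(n)

delta = 1.0                               # assumed Gaussian prior precision
L = np.sqrt(delta) * np.eye(n)            # prior "square root" factor

def sample_projected_posterior():
    """Draw one sample: perturb the data and the prior term, then solve a
    nonnegativity-constrained least squares problem. This mimics the
    perturb-then-optimize idea from the summary, with the polyhedral
    constraint set taken to be the cone x >= 0."""
    y_pert = y + sigma * rng.standard_normal(n)   # perturbed data
    w = rng.standard_normal(n)                    # perturbed prior term
    # Stack data-misfit and prior blocks into one least squares system.
    M = np.vstack([A / sigma, L])
    b = np.concatenate([y_pert / sigma, w])
    return lsq_linear(M, b, bounds=(0.0, np.inf)).x

samples = np.array([sample_projected_posterior() for _ in range(50)])
post_mean = samples.mean(axis=0)          # Monte Carlo posterior mean estimate
```

Each solve is a standard bound-constrained least squares problem, so off-the-shelf solvers such as SciPy's `lsq_linear` suffice; samples that the unconstrained posterior would place outside the cone end up on its boundary, which is the effect the paper's projection targets.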
Item Description: Viewed on 08.01.2025
Physical Description: Online Resource