Modularized Bayesian analyses and cutting feedback in likelihood-free inference

03/18/2022
by Atlanta Chakraborty, et al.

There has been much recent interest in modifying Bayesian inference for misspecified models so that it is useful for specific purposes. One popular modified Bayesian inference method is "cutting feedback", which can be used when the model consists of a number of coupled modules, with only some of the modules being misspecified. Cutting feedback methods represent the full posterior distribution in terms of conditional and sequential components, and then modify some terms in such a representation based on the modular structure for specification or computation of a modified posterior distribution. The main goal is to avoid contamination of inferences for parameters of interest by misspecified modules. Computation for cut posterior distributions is challenging, and here we consider cutting feedback for likelihood-free inference based on Gaussian mixture approximations to the joint distribution of parameters and data summary statistics. We exploit the fact that marginal and conditional distributions of a Gaussian mixture are Gaussian mixtures to give explicit approximations to marginal or conditional posterior distributions, so that we can easily approximate cut posterior analyses. The mixture approach allows repeated approximation of posterior distributions for different data based on a single mixture fit, which is important for model checks that aid in the decision of whether to "cut". A semi-modular approach to likelihood-free inference where feedback is partially cut is also developed. The benefits of the method are illustrated in two challenging examples: a collective cell spreading model and a continuous-time model for asset returns with jumps.
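As a rough illustration of the mixture conditioning step described in the abstract, the sketch below (not the authors' code; all function and variable names are illustrative) conditions a Gaussian mixture fitted to joint draws of (theta, s) on an observed summary s_obs, using the fact that the conditional distribution of a Gaussian mixture is again a Gaussian mixture. This is the property that lets a single mixture fit yield an approximate posterior for any observed summary without refitting.

# Minimal sketch, assuming a mixture over the concatenated vector (theta, s)
# has already been fitted (e.g. by EM on simulated parameter/summary pairs).
import numpy as np
from scipy.stats import multivariate_normal

def conditional_mixture(weights, means, covs, s_obs, d_theta):
    """Condition a Gaussian mixture over (theta, s) on s = s_obs.

    weights : (K,) mixture weights
    means   : (K, d_theta + d_s) component means
    covs    : (K, d_theta + d_s, d_theta + d_s) component covariances
    Returns weights, means and covariances of the mixture for theta | s = s_obs.
    """
    new_w, new_mu, new_cov = [], [], []
    for w, mu, S in zip(weights, means, covs):
        mu_t, mu_s = mu[:d_theta], mu[d_theta:]
        S_tt = S[:d_theta, :d_theta]
        S_ts = S[:d_theta, d_theta:]
        S_ss = S[d_theta:, d_theta:]
        gain = S_ts @ np.linalg.inv(S_ss)
        # Conditional mean and covariance of theta given s = s_obs
        new_mu.append(mu_t + gain @ (s_obs - mu_s))
        new_cov.append(S_tt - gain @ S_ts.T)
        # Reweight each component by how well it explains the observed summary
        new_w.append(w * multivariate_normal.pdf(s_obs, mean=mu_s, cov=S_ss))
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_cov)

Because the fit is reused across different values of s_obs, the same mixture can be conditioned repeatedly, which is what makes the model checks (and the decision of whether to cut) computationally cheap in this setting.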
