A deterministic and computable Bernstein-von Mises theorem

04/04/2019
by Guillaume P. Dehaene

Bernstein-von Mises (BvM) results establish that the Laplace approximation is asymptotically correct in the large-data limit. However, these results are inappropriate for computational purposes, since they only hold over most, and not all, datasets and involve hard-to-estimate constants. In this article, I present a new BvM theorem which bounds the Kullback-Leibler (KL) divergence between a fixed log-concave density f(θ) and its Laplace approximation g_LAP(θ). The bound goes to 0 as the higher derivatives of f(θ) tend to 0 and f(θ) becomes increasingly Gaussian. The classical BvM theorem in the IID large-data limit is recovered as a corollary. Critically, this theorem further suggests a number of computable approximations of the KL divergence, the most promising being:

KL(g_LAP, f) ≈ 1/2 Var_{θ∼g_LAP}( log f(θ) − log g_LAP(θ) )

An empirical investigation of these bounds in the logistic classification model reveals that these approximations are accurate surrogates for the KL divergence. This result, and future results of a similar nature, could provide a path towards rigorously controlling the error of the Laplace approximation and of more modern approximation methods.
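To make the variance surrogate concrete, here is a minimal sketch (not the paper's code) of how it could be estimated by Monte Carlo for a one-dimensional log-concave target. The target log_f, the finite-difference step, and the sample count are all illustrative choices; a note worth making is that the variance is invariant to additive constants, so the surrogate is computable from an unnormalized log-density.

    # Minimal sketch (illustrative, not the paper's implementation):
    # estimate  KL(g_LAP, f) ≈ 1/2 Var_{θ~g_LAP}(log f(θ) - log g_LAP(θ))
    # for a 1-D log-concave density by sampling from the Laplace approximation.

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    def log_f(theta):
        # Hypothetical target: an unnormalized log-concave log-density,
        # here a logistic-likelihood-style term plus a Gaussian prior.
        return -np.logaddexp(0.0, -3.0 * theta) - 0.5 * theta**2

    # 1. Laplace approximation: mode of log f and curvature at the mode.
    res = minimize_scalar(lambda t: -log_f(t))
    mode = res.x
    h = 1e-4  # central finite difference for the second derivative
    hess = (log_f(mode + h) - 2.0 * log_f(mode) + log_f(mode - h)) / h**2
    sigma = np.sqrt(-1.0 / hess)  # g_LAP = N(mode, sigma^2)

    # 2. Monte Carlo estimate of the variance surrogate under g_LAP.
    # The normalization constant of f cancels: Var is shift-invariant.
    rng = np.random.default_rng(0)
    samples = mode + sigma * rng.standard_normal(100_000)
    log_ratio = log_f(samples) - norm.logpdf(samples, loc=mode, scale=sigma)
    kl_surrogate = 0.5 * np.var(log_ratio)
    print(f"KL(g_LAP, f) ~= {kl_surrogate:.2e}")

In higher dimensions the same recipe applies with the Hessian matrix of log f at the mode and a multivariate Gaussian in place of norm; the per-sample cost is one evaluation of the unnormalized log-density.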

