Regularization of Bayesian shrinkage priors and inference via geometrically / uniformly ergodic Gibbs sampler

11/06/2019
by Akihiko Nishimura, et al.

Use of continuous shrinkage priors, with a "spike" near zero and heavy tails towards infinity, is an increasingly popular approach to inducing sparsity in parameter estimates. When the parameters are only weakly identified by the likelihood, however, the posterior may end up with tails as heavy as the prior's, jeopardizing the robustness of inference. A natural solution is to regularize, or lighten, the tails of a shrinkage prior beyond a reasonable parameter range. Existing regularization strategies, however, undermine the attractive computational properties of shrinkage priors. Our alternative formulation achieves regularization while preserving the essential aspects of the original shrinkage priors. We study theoretical properties of the Gibbs sampler on the resulting posterior distributions, with emphasis on the convergence rate of the Pólya-Gamma Gibbs sampler for sparse logistic regression. Our analysis shows that the proposed regularization leads to geometric ergodicity under a broad range of global-local shrinkage priors. Essentially, the only requirement is that the prior π_local(·) on the local scale λ satisfy π_local(0) < ∞. When lim_{λ→0} π_local(λ)/λ^a < ∞ for some a > 0, as with Bayesian bridge priors, we show the sampler to be uniformly ergodic.
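For context, the sampler analyzed above is the standard Pólya-Gamma data-augmentation scheme for Bayesian logistic regression under a global-local prior β_j ~ N(0, τ²λ_j²). The following is a minimal sketch of that scheme, not the authors' implementation: it holds the global scale τ and local scales λ_j fixed rather than updating them from a specific shrinkage prior, and it assumes the third-party `polyagamma` package for PG(1, z) draws.

```python
import numpy as np
from polyagamma import random_polyagamma  # third-party PG(h, z) sampler


def pg_gibbs_logistic(X, y, tau, lam, n_iter=2000, seed=None):
    """Pólya-Gamma Gibbs sampler for logistic regression,
    y_i ~ Bernoulli(logit^{-1}(x_i' beta)), beta_j ~ N(0, tau^2 * lam_j^2).

    Sketch only: tau and lam are held fixed here; a full global-local
    sampler would also update them from the chosen shrinkage prior.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    kappa = y - 0.5                        # PG-augmented pseudo-response
    prior_prec = 1.0 / (tau**2 * lam**2)   # diagonal prior precision
    beta = np.zeros(p)
    samples = np.empty((n_iter, p))
    for t in range(n_iter):
        # Step 1: omega_i | beta ~ PG(1, x_i' beta)
        omega = random_polyagamma(z=X @ beta, random_state=rng)
        # Step 2: beta | omega ~ N(m, V),
        #   V = (X' diag(omega) X + Lambda^{-1})^{-1},  m = V X' kappa
        prec = X.T @ (omega[:, None] * X) + np.diag(prior_prec)
        L = np.linalg.cholesky(prec)       # prec = L L'
        m = np.linalg.solve(prec, X.T @ kappa)
        # Solving L' x = z gives x ~ N(0, prec^{-1}) = N(0, V)
        beta = m + np.linalg.solve(L.T, rng.standard_normal(p))
        samples[t] = beta
    return samples
```

In the full global-local sampler the updates of τ and λ_j are interleaved with the two steps above; the paper's point is that, with the proposed tail regularization, the resulting chain is geometrically ergodic whenever π_local(0) < ∞, and uniformly ergodic for bridge-type priors.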
