Convergence Analysis of a Collapsed Gibbs Sampler for Bayesian Vector Autoregressions
We propose a collapsed Gibbs sampler for Bayesian vector autoregressions with predictors, or exogenous variables, and study the proposed algorithm's convergence properties. The Markov chain generated by our algorithm converges to its stationary distribution at least as fast as those of competing (non-collapsed) Gibbs samplers, and it is shown to be geometrically ergodic regardless of whether the number of observations in the underlying vector autoregression is small or large in comparison to its order and dimension. We also give conditions under which the geometric ergodicity is asymptotically stable as the number of observations tends to infinity. Specifically, the geometric convergence rate is shown to be bounded away from unity asymptotically, either almost surely or with probability tending to one, depending on what is assumed about the data-generating process. Our results are among the first of their kind for practically relevant Markov chain Monte Carlo algorithms.
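To make the object of study concrete, the following is a minimal sketch of the kind of (non-collapsed) two-block Gibbs sampler the abstract compares against: a Bayesian VAR with an independent Gaussian prior on the coefficients and an inverse-Wishart prior on the error covariance, alternating draws from the two full conditionals. The model, priors, and hyperparameters here are illustrative assumptions, not the paper's exact specification; the paper's collapsed variant would instead integrate the coefficients out of one of the updates.

```python
import numpy as np
from numpy.linalg import cholesky
from scipy.stats import invwishart

rng = np.random.default_rng(0)

# Simulate a small VAR(1): y_t = A y_{t-1} + eps_t, eps_t ~ N(0, Sigma).
# (Illustrative data-generating process, not from the paper.)
m, T = 2, 200
A_true = np.array([[0.5, 0.1], [0.0, 0.4]])
Sigma_true = np.array([[1.0, 0.3], [0.3, 1.0]])
y = np.zeros((T + 1, m))
L = cholesky(Sigma_true)
for t in range(T):
    y[t + 1] = A_true @ y[t] + L @ rng.standard_normal(m)
Y, X = y[1:], y[:-1]          # regression form: Y = X B + E, with B = A'
k = X.shape[1]

# Assumed priors: vec(B) ~ N(0, tau2 * I), Sigma ~ IW(nu0, S0).
tau2, nu0, S0 = 10.0, m + 2, np.eye(m)

B, Sigma = np.zeros((k, m)), np.eye(m)
XtX, XtY = X.T @ X, X.T @ Y
n_iter, burn, draws = 2000, 500, []
for it in range(n_iter):
    # Block 1: vec(B) | Sigma, data is Gaussian with precision
    #   Sigma^{-1} (x) X'X + (1/tau2) I  (column-major vec convention).
    Sinv = np.linalg.inv(Sigma)
    prec = np.kron(Sinv, XtX) + np.eye(k * m) / tau2
    mean = np.linalg.solve(prec, (XtY @ Sinv).flatten(order="F"))
    C = cholesky(prec)
    b = mean + np.linalg.solve(C.T, rng.standard_normal(k * m))
    B = b.reshape((k, m), order="F")

    # Block 2: Sigma | B, data is inverse-Wishart.
    resid = Y - X @ B
    Sigma = invwishart.rvs(df=nu0 + T, scale=S0 + resid.T @ resid,
                           random_state=rng)
    if it >= burn:
        draws.append(B)

B_mean = np.mean(draws, axis=0)          # posterior mean, should be near A_true'
Sigma_mean = Sigma                        # last draw, for inspection
```

The abstract's convergence claims concern exactly how fast chains like this one forget their initialization; collapsing one block can only improve the geometric convergence rate.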