Dimensionally Tight Bounds for Second-Order Hamiltonian Monte Carlo

02/24/2018
by   Oren Mangoubi, et al.

Hamiltonian Monte Carlo (HMC) is a widely deployed method for sampling from high-dimensional distributions in statistics and machine learning. HMC is known to run very efficiently in practice, and its popular second-order "leapfrog" implementation has long been conjectured to run in d^{1/4} steps. Here we show that this conjecture is true when sampling from strongly log-concave target distributions that satisfy a weak third-order regularity property associated with the input data. Our regularity condition is weaker than the Lipschitz Hessian property and allows us to show faster running time bounds for a much larger class of distributions than would be possible with the usual Lipschitz Hessian constant alone. Important distributions that satisfy our regularity condition include posterior distributions used in Bayesian logistic regression for which the data satisfies an "incoherence" property. Our result compares favorably with the best available running time bounds for the class of strongly log-concave distributions, which grow like d^{1/2} with the dimension d. Moreover, our simulations on synthetic data suggest that, when our regularity condition is satisfied, leapfrog HMC outperforms its competitors in both accuracy and running time.
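
To make the second-order "leapfrog" integrator concrete, here is a minimal sketch of a single leapfrog HMC iteration on a generic strongly log-concave target (a standard Gaussian in this example). The step size, number of leapfrog steps, and the names log_p, grad_log_p, and hmc_step are illustrative assumptions, not parameters or an implementation prescribed by the paper.

```python
import numpy as np

def log_p(x):
    # Example target: standard Gaussian (strongly log-concave).
    return -0.5 * np.dot(x, x)

def grad_log_p(x):
    return -x

def hmc_step(x, step_size=0.1, n_leapfrog=10, rng=None):
    """One HMC iteration using the second-order leapfrog integrator."""
    rng = np.random.default_rng() if rng is None else rng
    p = rng.standard_normal(x.shape)          # resample momentum
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog: half momentum step, alternating full position/momentum steps,
    # then remove the extra half momentum step at the end.
    p_new += 0.5 * step_size * grad_log_p(x_new)
    for _ in range(n_leapfrog):
        x_new += step_size * p_new
        p_new += step_size * grad_log_p(x_new)
    p_new -= 0.5 * step_size * grad_log_p(x_new)

    # Metropolis accept/reject corrects the discretization error of the integrator.
    log_accept = (log_p(x_new) - 0.5 * np.dot(p_new, p_new)) \
               - (log_p(x) - 0.5 * np.dot(p, p))
    return x_new if np.log(rng.uniform()) < log_accept else x

if __name__ == "__main__":
    # Run a short chain on a 100-dimensional target.
    rng = np.random.default_rng(0)
    x = np.zeros(100)
    for _ in range(1000):
        x = hmc_step(x, rng=rng)
```

In the regime the paper studies, the cost of interest is how many such leapfrog steps are needed as the dimension d grows; the d^{1/4} bound corresponds to taking the step size smaller as d increases, whereas the settings above are fixed purely for illustration.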
