Rapid Mixing of Hamiltonian Monte Carlo on Strongly Log-Concave Distributions

08/23/2017
by Oren Mangoubi et al.

We obtain several quantitative bounds on the mixing properties of the Hamiltonian Monte Carlo (HMC) algorithm for a strongly log-concave target distribution π on R^d, showing that HMC mixes quickly in this setting. One of our main results is a dimension-free bound on the mixing of an "ideal" HMC chain, which is used to show that the usual leapfrog implementation of HMC can sample from π using only O(d^{1/4}) gradient evaluations. This dependence on dimension is sharp, and our results significantly extend and improve previous quantitative bounds on the mixing of HMC.
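To make the object of study concrete, here is a minimal sketch of the standard leapfrog HMC sampler the abstract refers to. The function name `hmc_sample`, the step size, the number of leapfrog steps, and the Gaussian test target are illustrative assumptions, not the tuning or the specific chain analyzed in the paper.

```python
import numpy as np

def hmc_sample(log_pi, grad_log_pi, x0, step=0.2, n_leap=10, n_samples=2000, seed=0):
    """Leapfrog HMC targeting pi(x) ∝ exp(log_pi(x)) on R^d.
    Illustrative sketch; parameter defaults are assumptions."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for t in range(n_samples):
        p = rng.standard_normal(x.size)              # fresh Gaussian momentum
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * step * grad_log_pi(x_new)     # initial momentum half-step
        for i in range(n_leap):
            x_new += step * p_new                    # full position step
            if i < n_leap - 1:
                p_new += step * grad_log_pi(x_new)   # full momentum step
        p_new += 0.5 * step * grad_log_pi(x_new)     # final momentum half-step
        # Metropolis correction for the leapfrog discretization error
        h_old = -log_pi(x) + 0.5 * p @ p
        h_new = -log_pi(x_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(h_old - h_new):
            x = x_new
        samples[t] = x
    return samples

# Usage: standard Gaussian target, a strongly log-concave example
log_pi = lambda x: -0.5 * x @ x
grad_log_pi = lambda x: -x
draws = hmc_sample(log_pi, grad_log_pi, x0=np.ones(2))
```

For a fixed target, the paper's result concerns how the number of gradient evaluations needed for mixing scales with the dimension d; the sampler above spends `n_leap` gradient evaluations per proposed move.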


