A Neural Network MCMC sampler that maximizes Proposal Entropy

10/07/2020
by Zengyi Li, et al.

Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probability distributions and offer guarantees of exact sampling. In the continuous case, however, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods. Augmenting samplers with neural networks can potentially improve their efficiency. Previous neural-network-based samplers were trained with objectives that either did not explicitly encourage exploration, or used an L2 jump objective that could only be applied to well-structured distributions. It therefore seems promising to instead maximize the proposal entropy, which adapts the proposal to distributions of any shape. To allow direct optimization of the proposal entropy, we propose a neural network MCMC sampler with a flexible and tractable proposal distribution. Specifically, our network architecture utilizes the gradient of the target distribution for generating proposals. Our model achieves significantly higher efficiency than previous neural network MCMC techniques on a variety of sampling tasks. Further, the sampler is applied to the training of a convergent energy-based model of natural images. The adaptive sampler achieves unbiased sampling with significantly higher proposal entropy than a Langevin dynamics sampler.
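To make the idea concrete, here is a minimal, illustrative sketch (not the paper's method) of the two ingredients the abstract describes: a gradient-informed proposal with a Metropolis-Hastings correction for exactness, and adaptation that increases the proposal's entropy as long as acceptance stays reasonable. It uses a plain MALA step with a scalar step size on a toy anisotropic Gaussian; the paper replaces the scalar adaptation with a neural network conditioned on the state and the target gradient, and optimizes the proposal entropy directly. All names and the toy target are assumptions for illustration.

```python
import numpy as np

SCALES = np.array([1.0, 0.1])  # anisotropic toy target (illustrative only)

def log_target(x):
    # Unnormalized log-density of a 2D Gaussian with unequal scales,
    # a stand-in for the ill-conditioned geometries the paper targets.
    return -0.5 * np.sum((x / SCALES) ** 2)

def grad_log_target(x):
    return -x / SCALES ** 2

def mala_step(x, step, rng):
    # Langevin proposal: the mean follows the target gradient, mirroring
    # the paper's use of grad log p(x) as a proposal-network input.
    mean_x = x + 0.5 * step ** 2 * grad_log_target(x)
    y = mean_x + step * rng.standard_normal(x.shape)
    mean_y = y + 0.5 * step ** 2 * grad_log_target(y)
    # Metropolis-Hastings correction keeps sampling exact.
    log_q_xy = -0.5 * np.sum((y - mean_x) ** 2) / step ** 2
    log_q_yx = -0.5 * np.sum((x - mean_y) ** 2) / step ** 2
    log_alpha = log_target(y) - log_target(x) + log_q_yx - log_q_xy
    accept = np.log(rng.uniform()) < log_alpha
    return (y if accept else x), accept

def proposal_entropy(step, dim):
    # Differential entropy of the isotropic Gaussian proposal; growing
    # the step size grows this quantity.
    return 0.5 * dim * np.log(2 * np.pi * np.e * step ** 2)

rng = np.random.default_rng(0)
x = np.zeros(2)
step = 0.5
accepts = 0
n_steps = 2000
for _ in range(n_steps):
    x, acc = mala_step(x, step, rng)
    accepts += acc
    # Crude entropy-seeking adaptation: widen the proposal while moves
    # keep being accepted, shrink it otherwise. The paper instead
    # maximizes the proposal entropy of a learned, tractable proposal.
    step *= 1.002 if acc else 0.998

print(f"acceptance rate: {accepts / n_steps:.2f}, "
      f"proposal entropy: {proposal_entropy(step, 2):.2f}")
```

The MH correction is what preserves the exact-sampling guarantee mentioned above; the adaptation only changes how efficiently the chain explores, not which distribution it samples from (assuming adaptation is frozen before samples are collected).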


research
08/05/2021

The No-U-Turn Sampler as a Proposal Distribution in a Sequential Monte Carlo Sampler with a Near-Optimal L-Kernel

Markov Chain Monte Carlo (MCMC) is a powerful method for drawing samples...
research
09/08/2021

LSB: Local Self-Balancing MCMC in Discrete Spaces

Markov Chain Monte Carlo (MCMC) methods are promising solutions to sampl...
research
10/13/2021

Generating MCMC proposals by randomly rotating the regular simplex

We present the simplicial sampler, a class of parallel MCMC methods that...
research
06/08/2023

Entropy-based Training Methods for Scalable Neural Implicit Sampler

Efficiently sampling from un-normalized target distributions is a fundam...
research
12/10/2021

Sampling from Discrete Energy-Based Models with Quality/Efficiency Trade-offs

Energy-Based Models (EBMs) allow for extremely flexible specifications o...
research
10/25/2021

Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals

Markov Chain Monte Carlo (MCMC) methods are a powerful tool for computat...
research
06/12/2022

Monte Carlo with Soft Constraints: the Surface Augmented Sampler

We describe an MCMC method for sampling distributions with soft constrai...
