Normalizing flow sampling with Langevin dynamics in the latent space

05/20/2023
by Florentin Coeurdoux, et al.

Normalizing flows (NFs) use a continuous generator to map a simple latent distribution (e.g., Gaussian) to an empirical target distribution associated with a training data set. Once trained by minimizing a variational objective, the learned map provides an approximate generative model of the target distribution. Since standard NFs implement differentiable maps, they can exhibit pathological behavior when targeting complex distributions, for instance distributions supported on multi-component topologies or characterized by multiple high-probability modes separated by very unlikely regions. A typical symptom is an exploding Jacobian norm of the transformation in very low-probability areas. This paper proposes to overcome this issue with a new Markov chain Monte Carlo algorithm that samples from the target distribution in the latent domain before transporting the samples back to the target domain. The approach relies on a Metropolis-adjusted Langevin algorithm (MALA) whose dynamics explicitly exploit the Jacobian of the transformation. Unlike alternative approaches, the proposed strategy preserves the tractability of the likelihood and requires no specific training: it can be used directly with any pre-trained NF network, regardless of the architecture. Experiments on synthetic and high-dimensional real data sets illustrate the efficiency of the method.
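To make the sampling idea concrete, here is a minimal, library-free sketch of a generic Metropolis-adjusted Langevin algorithm. In the paper's setting, `log_p` and `grad_log_p` would be the latent-space log-density of the pretrained flow, including its log-Jacobian term; the function names, the toy standard-normal target, and all parameter values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def mala(log_p, grad_log_p, z0, step, n_steps, rng):
    """Generic MALA sampler: Langevin proposal + Metropolis correction."""
    z = np.asarray(z0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        # Langevin proposal: gradient drift plus Gaussian diffusion
        prop = z + step * grad_log_p(z) + np.sqrt(2.0 * step) * noise

        def log_q(a, b):
            # Log-density (up to a constant) of proposing a from b
            d = a - b - step * grad_log_p(b)
            return -np.dot(d, d) / (4.0 * step)

        # Metropolis-Hastings acceptance ratio in log space
        log_alpha = (log_p(prop) + log_q(z, prop)
                     - log_p(z) - log_q(prop, z))
        if np.log(rng.uniform()) < log_alpha:
            z = prop
        samples.append(z.copy())
    return np.array(samples)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy latent target: a 1-D standard normal stands in for the
    # flow's latent density (which would also carry the log-Jacobian).
    chain = mala(lambda z: -0.5 * (z @ z), lambda z: -z,
                 np.array([0.0]), step=0.1, n_steps=5000, rng=rng)
    print(chain[1000:].mean(), chain[1000:].std())
```

After sampling, each accepted latent point would be pushed through the flow's generator to obtain a sample in the data domain; since the method needs only gradients of the latent log-density, it applies to any pre-trained flow without retraining.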


Related research

- 02/04/2021: Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC
  Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sam...
- 01/12/2018: Deep Learning for Sampling from Arbitrary Probability Distributions
  This paper proposes a fully connected neural network model to map sample...
- 10/13/2019: Deep Markov Chain Monte Carlo
  We propose a new computationally efficient sampling scheme for Bayesian ...
- 06/15/2020: Learning Latent Space Energy-Based Prior Model
  The generator model assumes that the observed example is generated by a ...
- 10/01/2020: VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models
  Energy-based models (EBMs) have recently been successful in representing...
- 02/09/2023: On Sampling with Approximate Transport Maps
  Transport maps can ease the sampling of distributions with non-trivial g...
- 12/31/2021: Machine Learning Trivializing Maps: A First Step Towards Understanding How Flow-Based Samplers Scale Up
  A trivializing map is a field transformation whose Jacobian determinant ...
