Sampling with flows, diffusion and autoregressive neural networks: A spin-glass perspective

08/27/2023
by Davide Ghio, et al.

Recent years have witnessed the development of powerful generative models based on flows, diffusion, or autoregressive neural networks, which achieve remarkable success in generating data from examples, with applications in a broad range of areas. A theoretical analysis of the performance and limitations of these methods remains, however, challenging. In this paper, we take a step in this direction by analysing the efficiency of sampling with these methods on a class of problems with a known probability distribution, and comparing it with the sampling performance of more traditional methods such as Markov chain Monte Carlo and Langevin dynamics. We focus on a class of probability distributions widely studied in the statistical physics of disordered systems that relates to spin glasses, statistical inference, and constraint satisfaction problems. We leverage the fact that sampling via flow-based, diffusion-based, or autoregressive network methods can be equivalently mapped to the analysis of Bayes-optimal denoising of a modified probability measure. Our findings demonstrate that these methods encounter difficulties in sampling stemming from the presence of a first-order phase transition along the algorithm's denoising path. Our conclusions go both ways: we identify regions of parameters where these methods are unable to sample efficiently while standard Monte Carlo or Langevin approaches can, and regions where the opposite happens: the standard approaches are inefficient while the discussed generative methods work well.
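To make the traditional baseline mentioned above concrete, here is a minimal sketch of unadjusted Langevin dynamics for sampling from a known probability distribution. The target (a 1D standard Gaussian), the step size, the number of steps, and all function names are illustrative assumptions for this sketch, not the paper's actual spin-glass setup.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=1e-3, n_steps=5000, rng=None):
    """Unadjusted Langevin dynamics:
    x_{t+1} = x_t + step * grad log p(x_t) + sqrt(2 * step) * gaussian noise.
    Returns the final state, an approximate sample from p when the
    step size is small and the chain mixes well."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_p(x) + np.sqrt(2.0 * step) * noise
    return x

# Illustrative target: standard Gaussian, so grad log p(x) = -x.
samples = np.array([
    langevin_sample(lambda x: -x, x0=[5.0])[0] for _ in range(200)
])
```

For the simple Gaussian target the chain mixes quickly; the paper's point is precisely that for distributions with first-order phase transitions such local dynamics (or, conversely, the generative denoising path) can fail to equilibrate in feasible time.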


