Bayesian Flow Networks

08/14/2023
by Alex Graves et al.

This paper introduces Bayesian Flow Networks (BFNs), a new class of generative model in which the parameters of a set of independent distributions are modified with Bayesian inference in light of noisy data samples, then passed as input to a neural network that outputs a second, interdependent distribution. Starting from a simple prior and iteratively updating the two distributions yields a generative procedure similar to the reverse process of diffusion models; however, it is conceptually simpler in that no forward process is required. Discrete and continuous-time loss functions are derived for continuous, discretised and discrete data, along with sample generation procedures. Notably, the network inputs for discrete data lie on the probability simplex and are therefore natively differentiable, paving the way for gradient-based sample guidance and few-step generation in discrete domains such as language modelling. The loss function directly optimises data compression and places no restrictions on the network architecture. In our experiments, BFNs achieve competitive log-likelihoods for image modelling on dynamically binarized MNIST and CIFAR-10, and outperform all known discrete diffusion models on the text8 character-level language modelling task.
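To make the generative procedure described above concrete, the sketch below illustrates the kind of iterative loop involved for continuous data: the parameters of a factorised Gaussian input distribution are passed to a network, a noisy sample is drawn around the network's prediction, and the parameters are refreshed with a standard conjugate Gaussian Bayesian update. This is a minimal sketch, not the paper's exact specification: the network interface net(mu, t), the step count, and the geometric-style accuracy schedule are illustrative assumptions.

```python
import numpy as np

def bfn_sample_continuous(net, data_dim, n_steps=20, sigma_1=0.02, rng=None):
    """Schematic BFN-style sampler for continuous data (illustrative only).

    net(mu, t) is assumed to map the current input-distribution means and a
    time in [0, 1] to a prediction of the data; the accuracy schedule below
    is an assumption chosen for illustration, not the paper's derivation.
    """
    rng = np.random.default_rng() if rng is None else rng

    mu = np.zeros(data_dim)   # prior mean of the factorised Gaussian input distribution
    rho = np.ones(data_dim)   # prior precision

    for i in range(1, n_steps + 1):
        t = (i - 1) / n_steps
        x_hat = net(mu, t)    # network turns independent parameters into a joint prediction

        # Precision added at this step (assumed geometric-style schedule).
        alpha = sigma_1 ** (-2 * i / n_steps) * (1 - sigma_1 ** (2 / n_steps))

        # Noisy sample centred on the network's prediction.
        y = rng.normal(loc=x_hat, scale=alpha ** -0.5)

        # Conjugate Gaussian Bayesian update of the input distribution.
        mu = (rho * mu + alpha * y) / (rho + alpha)
        rho = rho + alpha

    return net(mu, 1.0)       # final prediction after the last update


# Toy usage with a stand-in "network" that simply echoes its input:
identity_net = lambda mu, t: mu
sample = bfn_sample_continuous(identity_net, data_dim=4)
```

For discrete data the analogous update keeps the parameters on the probability simplex (for instance, a categorical prior re-weighted by a noisy observation and renormalised), which is why the network inputs remain natively differentiable.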


Related research

Blackout Diffusion: Generative Diffusion Models in Discrete-State Spaces (05/18/2023)
Typical generative diffusion models rely on a Gaussian diffusion process...

A Continuous Time Framework for Discrete Denoising Models (05/30/2022)
We provide the first complete continuous time framework for denoising di...

Continuous diffusion for categorical data (11/28/2022)
Diffusion models have quickly become the go-to paradigm for generative m...

Diffusion on the Probability Simplex (09/05/2023)
Diffusion models learn to reverse the progressive noising of a data dist...

A Cheaper and Better Diffusion Language Model with Soft-Masked Noise (04/10/2023)
Diffusion models that are based on iterative denoising have been recentl...

Refining Generative Process with Discriminator Guidance in Score-based Diffusion Models (11/28/2022)
While the success of diffusion models has been witnessed in various doma...

Living a discrete life in a continuous world: Reference with distributed representations (02/06/2017)
Reference is a crucial property of language that allows us to connect li...
