Score diffusion models without early stopping: finite Fisher information is all you need

08/23/2023
by   Giovanni Conforti, et al.

Diffusion models are a new class of generative models that revolve around the estimation of the score function associated with a stochastic differential equation. Once acquired, the approximated score function is used to simulate the corresponding time-reversal process, ultimately enabling the generation of approximate data samples. Despite the evident practical significance of these models, a notable challenge persists in the form of a lack of comprehensive quantitative results, especially in scenarios involving non-regular scores and estimators. In almost all reported bounds in Kullback-Leibler (KL) divergence, it is assumed that either the score function or its approximation is Lipschitz uniformly in time. However, this condition is very restrictive in practice and appears difficult to establish. To circumvent this issue, previous works mainly focused on establishing convergence bounds in KL for an early-stopped version of the diffusion model and a smoothed version of the data distribution, or on assuming that the data distribution is supported on a compact manifold. These explorations have led to interesting bounds in either Wasserstein or Fortet-Mourier metrics. However, the question remains as to the relevance of such early-stopping procedures or compactness conditions, and in particular whether there exists a natural and mild condition ensuring explicit and sharp convergence bounds in KL. In this article, we tackle the aforementioned limitations by focusing on score diffusion models with fixed step size stemming from the Ornstein-Uhlenbeck semigroup and its kinetic counterpart. Our study provides a rigorous analysis, yielding simple, improved and sharp convergence bounds in KL applicable to any data distribution with finite Fisher information with respect to the standard Gaussian distribution.
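As a brief orientation on the objects named above (standard definitions in this literature, not taken verbatim from the paper): the forward noising process is the Ornstein-Uhlenbeck SDE, sampling runs its time reversal driven by the score, and the relative Fisher information with respect to the standard Gaussian $\gamma_d$ is the quantity the main assumption bounds.

```latex
% Forward OU process, its time reversal, and relative Fisher information.
\begin{align}
  \mathrm{d}X_t &= -X_t\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t,
    \qquad X_0 \sim p_{\mathrm{data}}, \\
  \mathrm{d}Y_t &= \bigl(Y_t + 2\nabla \log p_{T-t}(Y_t)\bigr)\,\mathrm{d}t
    + \sqrt{2}\,\mathrm{d}\bar{B}_t, \qquad Y_0 \sim p_T, \\
  \mathcal{I}(p \,\|\, \gamma_d) &=
    \int \Bigl\| \nabla \log \tfrac{\mathrm{d}p}{\mathrm{d}\gamma_d} \Bigr\|^2 \,\mathrm{d}p .
\end{align}
```

The "fixed step size" setting in the abstract refers to discretizing this reverse SDE on a uniform grid. Below is a minimal Euler-Maruyama sketch of such a sampler, assuming access to a score approximation `score(x, t)`; the placeholder score is the exact one for standard Gaussian data, not the paper's estimator.

```python
import numpy as np

def score(x, t):
    # Placeholder: when p_data = N(0, I), the OU marginal stays N(0, I),
    # so grad log p_t(x) = -x. A learned score network would replace this.
    return -x

def reverse_ou_sampler(dim, horizon=5.0, n_steps=500, seed=0):
    rng = np.random.default_rng(seed)
    h = horizon / n_steps                 # fixed step size, as in the abstract
    y = rng.standard_normal(dim)          # initialize from N(0, I), approximating p_T
    for k in range(n_steps):
        t = horizon - k * h               # current forward time
        drift = y + 2.0 * score(y, t)     # reverse drift: x + 2 * grad log p_t(x)
        y = y + h * drift + np.sqrt(2.0 * h) * rng.standard_normal(dim)
    return y

print(reverse_ou_sampler(dim=2))
```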


Related research

11/03/2022 · Improved Analysis of Score-based Generative Modeling: User-Friendly Bounds under Minimal Smoothness Assumptions
In this paper, we focus on the theoretical analysis of diffusion-based g...

08/07/2023 · Linear Convergence Bounds for Diffusion Models via Stochastic Localization
Diffusion models are a powerful method for generating approximate sample...

08/10/2022 · Convergence of denoising diffusion models under the manifold hypothesis
Denoising diffusion models are a recent class of generative models exhib...

02/14/2023 · Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data
Diffusion models achieve state-of-the-art performance in various generat...

07/19/2022 · A sharp uniform-in-time error estimate for Stochastic Gradient Langevin Dynamics
We establish a sharp uniform-in-time error estimate for the Stochastic G...

05/23/2023 · Improved Convergence of Score-Based Diffusion Models via Prediction-Correction
Score-based generative models (SGMs) are powerful tools to sample from c...

02/21/2023 · On Calibrating Diffusion Probabilistic Models
Recently, diffusion probabilistic models (DPMs) have achieved promising ...
