Rates of convergence for density estimation with GANs

01/30/2021
by Denis Belomestny et al.

We undertake a precise study of the non-asymptotic properties of vanilla generative adversarial networks (GANs) and derive theoretical guarantees for the problem of estimating an unknown d-dimensional density p^* under a proper choice of the class of generators and discriminators. We prove that the resulting density estimate converges to p^* in Jensen-Shannon (JS) divergence at the rate (log n / n)^{2β/(2β+d)}, where n is the sample size and β determines the smoothness of p^*. This is the first result in the literature on density estimation using vanilla GANs with JS rates faster than n^{-1/2} in the regime β > d/2.
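The claimed rate can be read off directly from the formula above. The following minimal sketch (not from the paper; the function names js_rate and parametric_rate and the chosen values of n, β, d are purely illustrative) evaluates the stated JS rate and compares it with the n^{-1/2} benchmark; the exponent 2β/(2β+d) exceeds 1/2 exactly when β > d/2, which is the regime highlighted in the abstract.

```python
import numpy as np

def js_rate(n, beta, d):
    """Stated JS-divergence rate (log n / n)^(2*beta / (2*beta + d)).

    n    : sample size
    beta : smoothness of the target density p^*
    d    : dimension of the data
    """
    return (np.log(n) / n) ** (2 * beta / (2 * beta + d))

def parametric_rate(n):
    """Benchmark rate n^(-1/2)."""
    return n ** -0.5

if __name__ == "__main__":
    n, d = 10_000, 4
    # The exponent 2*beta/(2*beta + d) is larger than 1/2 iff beta > d/2 = 2,
    # so (up to the log factor) the GAN rate eventually beats n^(-1/2) only there.
    for beta in (1.0, 2.0, 4.0):
        print(f"beta={beta}: GAN rate {js_rate(n, beta, d):.4f}, "
              f"benchmark {parametric_rate(n):.4f}")
```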


Related research

05/22/2018 · Nonparametric Density Estimation under Adversarial Losses
We study minimax convergence rates of nonparametric density estimation u...

04/18/2020 · Robust Density Estimation under Besov IPM Losses
We study minimax convergence rates of nonparametric density estimation i...

03/21/2018 · Some Theoretical Properties of GANs
Generative Adversarial Networks (GANs) are a class of generative algorit...

11/24/2020 · A Convenient Infinite Dimensional Framework for Generative Adversarial Learning
In recent years, generative adversarial networks (GANs) have demonstrate...

04/30/2021 · TREND: Truncated Generalized Normal Density Estimation of Inception Embeddings for Accurate GAN Evaluation
Evaluating image generation models such as generative adversarial networ...

09/19/2017 · Summable Reparameterizations of Wasserstein Critics in the One-Dimensional Setting
Generative adversarial networks (GANs) are an exciting alternative to al...

12/18/2020 · On the density estimation problem for uncertainty propagation with unknown input distributions
In this article we study the problem of quantifying the uncertainty in a...
