Divergence Frontiers for Generative Models: Sample Complexity, Quantization Level, and Frontier Integral

06/15/2021
by Lang Liu, et al.

The spectacular success of deep generative models calls for quantitative tools to measure their statistical performance. Divergence frontiers have recently been proposed as an evaluation framework for generative models, due to their ability to measure the quality-diversity trade-off inherent to deep generative modeling. However, the statistical behavior of divergence frontiers estimated from data has remained unknown to this day. In this paper, we establish non-asymptotic bounds on the sample complexity of the plug-in estimator of divergence frontiers. Along the way, we introduce a novel integral summary of divergence frontiers, the frontier integral. We derive the corresponding non-asymptotic bounds and discuss the choice of the quantization level by balancing the two types of approximation errors arising from its computation. We also augment the divergence frontier framework by investigating the statistical performance of smoothed distribution estimators such as the Good-Turing estimator. We illustrate the theoretical results with numerical examples from natural language processing and computer vision.
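To make the pipeline in the abstract concrete (quantize the sample space, estimate the bin probabilities, trace the frontier along a mixture path, and summarize it by an integral), here is a minimal NumPy sketch. It is illustrative only, not the authors' code: the frontier is traced via the mixtures R_lam = lam*P + (1-lam)*Q as in the divergence-frontier literature, the scalar summary integrates a simple symmetric combination of the two frontier coordinates over lam (a stand-in for the paper's exact frontier integral), and `good_turing_smooth` is a bare-bones Good-Turing-style missing-mass estimator. All function names and defaults are hypothetical.

```python
import numpy as np

def good_turing_smooth(counts):
    """Minimal Good-Turing-style smoothing (an assumption, not the paper's
    exact estimator): reserve the singleton mass N1/n for unseen bins and
    rescale the observed frequencies accordingly."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    n1 = float(np.sum(counts == 1.0))   # number of bins seen exactly once
    unseen = counts == 0
    if unseen.any() and 0.0 < n1 < n:
        return np.where(unseen,
                        (n1 / n) / unseen.sum(),          # spread missing mass
                        (1.0 - n1 / n) * counts / n)      # rescale seen bins
    return counts / n                   # fall back to the plug-in estimate

def kl(p, q, eps=1e-12):
    """KL(p || q) over a finite alphabet, clipping away from zero."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def divergence_frontier(p, q, num_points=99):
    """Trace the frontier {(KL(R || Q), KL(R || P))} along the mixture
    path R = lam * P + (1 - lam) * Q for lam in (0, 1)."""
    lams = np.linspace(0.01, 0.99, num_points)
    pts = []
    for lam in lams:
        r = lam * p + (1.0 - lam) * q   # mixture R_lam
        pts.append((kl(r, q), kl(r, p)))
    return lams, np.array(pts)

def frontier_integral(p, q, num_points=99):
    """Scalar summary of the frontier: trapezoidal integration over lam of
    a symmetric combination of the two coordinates. This integrand is an
    illustrative choice; see the paper for the exact frontier integral."""
    lams, pts = divergence_frontier(p, q, num_points)
    f = (1.0 - lams) * pts[:, 0] + lams * pts[:, 1]
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(lams)))
```

A toy usage, quantizing continuous 1-D samples into k data-dependent bins so that both types of error discussed in the abstract appear: a coarser k loses information about the distributions, a finer k makes each bin probability harder to estimate. The distributions and the value k = 32 below are purely illustrative.

```python
rng = np.random.default_rng(0)
x_model = rng.normal(0.2, 1.0, size=5000)   # samples from a "model"
x_data = rng.normal(0.0, 1.0, size=5000)    # samples from the "data"
k = 32                                      # quantization level
edges = np.quantile(np.concatenate([x_model, x_data]),
                    np.linspace(0.0, 1.0, k + 1))
p_hat = good_turing_smooth(np.histogram(x_model, bins=edges)[0])
q_hat = good_turing_smooth(np.histogram(x_data, bins=edges)[0])
print(frontier_integral(p_hat, q_hat))
```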


Related research

- Spread Divergences (11/21/2018): For distributions p and q with different support, the divergence general...
- Towards Goal, Feasibility, and Diversity-Oriented Deep Generative Models in Design (06/14/2022): Deep Generative Machine Learning Models (DGMs) have been growing in popu...
- Training Discrete Deep Generative Models via Gapped Straight-Through Estimator (06/15/2022): While deep generative models have succeeded in image processing, natural...
- Statistical and Topological Properties of Sliced Probability Divergences (03/12/2020): The idea of slicing divergences has been proven to be successful when co...
- DOI: Divergence-based Out-of-Distribution Indicators via Deep Generative Models (08/12/2021): To ensure robust and reliable classification results, OoD (out-of-distri...
- Triangular Flows for Generative Modeling: Statistical Consistency, Smoothness Classes, and Fast Rates (12/31/2021): Triangular flows, also known as Knöthe-Rosenblatt measure couplings, com...
- Manifold Topology Divergence: a Framework for Comparing Data Manifolds (06/08/2021): We develop a framework for comparing data manifolds, aimed, in particula...
