On distinguishability criteria for estimating generative models

by Ian J. Goodfellow, et al.

Two recently introduced criteria for estimation of generative models are both based on a reduction to binary classification. Noise-contrastive estimation (NCE) is an estimation procedure in which a generative model is trained to be able to distinguish data samples from noise samples. Generative adversarial networks (GANs) are pairs of generator and discriminator networks, with the generator network learning to generate samples by attempting to fool the discriminator network into believing its samples are real data. Both estimation procedures use the same function to drive learning, which naturally raises questions about how they are related to each other, as well as whether this function is related to maximum likelihood estimation (MLE). NCE corresponds to training an internal data model belonging to the discriminator network but using a fixed generator network. We show that a variant of NCE, with a dynamic generator network, is equivalent to maximum likelihood estimation. Since pairing a learned discriminator with an appropriate dynamically selected generator recovers MLE, one might expect the reverse to hold for pairing a learned generator with a certain discriminator. However, we show that recovering MLE for a learned generator requires departing from the distinguishability game. Specifically: (i) The expected gradient of the NCE discriminator can be made to match the expected gradient of MLE, if one is allowed to use a non-stationary noise distribution for NCE, (ii) No choice of discriminator network can make the expected gradient for the GAN generator match that of MLE, and (iii) The existing theory does not guarantee that GANs will converge in the non-convex case. This suggests that the key next step in GAN research is to determine whether GANs converge, and if not, to modify their training algorithm to force convergence.
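The "same function" shared by NCE and GANs is the distinguishability objective V(D) = E_data[log D(x)] + E_noise[log(1 - D(x))]; the two procedures differ only in how the discriminator D is constructed. The sketch below illustrates this in a toy 1-D Gaussian setting (the distributions, parameter values, and function names are illustrative assumptions, not taken from the paper): the NCE discriminator is defined analytically from an internal data model and a fixed noise distribution, while the GAN discriminator is a free parametric function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D setting (assumed for illustration): data ~ N(0, 1), noise ~ N(1, 1).
data = rng.normal(0.0, 1.0, size=10_000)
noise = rng.normal(1.0, 1.0, size=10_000)

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def value(discriminator):
    """Shared distinguishability objective:
    V(D) = E_data[log D(x)] + E_noise[log(1 - D(x))]."""
    return np.mean(np.log(discriminator(data))) + np.mean(np.log(1.0 - discriminator(noise)))

# NCE: the discriminator is not a free function but is induced by an
# internal data model p_model and the fixed noise distribution p_noise:
#   D(x) = p_model(x) / (p_model(x) + p_noise(x)).
def nce_discriminator(x, mu_model=0.5):
    p_model = gaussian_pdf(x, mu_model, 1.0)
    p_noise = gaussian_pdf(x, 1.0, 1.0)
    return p_model / (p_model + p_noise)

# GAN: the discriminator is a free parametric function; here a simple
# logistic regressor on x stands in for a neural network.
def gan_discriminator(x, w=-1.0, b=0.5):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

print(value(nce_discriminator))  # both are log-probability averages, so negative
print(value(gan_discriminator))
```

Maximizing V over mu_model recovers the NCE estimate of the data model, while in a GAN the generator would additionally be trained to push V down by moving the noise distribution toward the data.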




