KALE: When Energy-Based Learning Meets Adversarial Training

03/10/2020
by Michael Arbel, et al.

Legendre duality provides a variational lower bound for the Kullback-Leibler (KL) divergence that can be estimated from samples, without explicit knowledge of the density ratio. We use this estimator, the KL Approximate Lower-bound Estimate (KALE), in a contrastive setting for learning energy-based models, and show that it provides a maximum likelihood estimate (MLE). We then extend this procedure to adversarial training, where the discriminator represents the energy and the generator is the base measure of the energy-based model. Unlike in standard generative adversarial networks (GANs), the learned model uses both the generator and the discriminator to produce samples. This is achieved with Hamiltonian Monte Carlo in the latent space of the generator, guided by information from the discriminator, to find regions of that space that yield better-quality samples. We also show that, unlike the KL, KALE enjoys smoothness properties that make it suitable for adversarial training, and we provide convergence rates for KALE when the negative log density ratio belongs to the variational family. Finally, we demonstrate the effectiveness of this approach on simple datasets.
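
As a concrete illustration of the estimator, here is a minimal PyTorch sketch (not the authors' code) of the sample-based objective KALE(P || Q) = sup_h E_P[h] - E_Q[exp(h)] + 1, which at the optimum recovers h = log(dP/dQ); the `critic` module and the optimiser call are hypothetical stand-ins.

```python
import torch

def kale_lower_bound(critic, x_data, x_model):
    """Sample-based estimate of  E_P[h] - E_Q[exp(h)] + 1  for a fixed critic h."""
    h_p = critic(x_data)     # h evaluated on samples from the data distribution P
    h_q = critic(x_model)    # h evaluated on samples from the model / base measure Q
    return h_p.mean() - torch.exp(h_q).mean() + 1.0

# Training the critic amounts to ascending this bound, e.g.
#   loss = -kale_lower_bound(critic, x_data, x_model)
#   loss.backward(); optimizer.step()
```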
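The sampling step can be sketched in the same spirit: the code below runs Hamiltonian Monte Carlo on the generator's latent variable z, targeting a density proportional to the latent prior times exp(-E(G(z))), so that pushing the resulting z through the generator yields samples informed by the discriminator's energy. The `generator` and `energy` modules, the standard Gaussian prior, and the step-size settings are assumptions made for illustration, not the paper's exact configuration.

```python
import torch

def potential(z, generator, energy):
    # U(z) = E(G(z)) + ||z||^2 / 2: negative log of the unnormalised latent target
    # (a standard Gaussian prior on z is an assumption of this sketch).
    return energy(generator(z)).sum() + 0.5 * (z ** 2).sum()

def grad_potential(z, generator, energy):
    z = z.detach().requires_grad_(True)
    (g,) = torch.autograd.grad(potential(z, generator, energy), z)
    return g

def hmc_step(z, generator, energy, step_size=0.05, n_leapfrog=10):
    """One HMC transition for a single latent chain z (batching omitted for clarity)."""
    z0 = z.detach()
    p0 = torch.randn_like(z0)                 # fresh Gaussian momentum
    z_new, p_new = z0.clone(), p0.clone()

    # Leapfrog integration of the Hamiltonian dynamics.
    p_new = p_new - 0.5 * step_size * grad_potential(z_new, generator, energy)
    for i in range(n_leapfrog):
        z_new = z_new + step_size * p_new
        if i < n_leapfrog - 1:
            p_new = p_new - step_size * grad_potential(z_new, generator, energy)
    p_new = p_new - 0.5 * step_size * grad_potential(z_new, generator, energy)

    # Metropolis-Hastings correction with H(z, p) = U(z) + ||p||^2 / 2.
    with torch.no_grad():
        h0 = potential(z0, generator, energy) + 0.5 * (p0 ** 2).sum()
        h1 = potential(z_new, generator, energy) + 0.5 * (p_new ** 2).sum()
        accept = torch.rand(()) < torch.exp(h0 - h1)
    return (z_new if accept else z0).detach()

# After the chain has mixed, x = generator(z) is a sample that uses both
# the generator (base measure) and the discriminator (energy).
```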

Related research

Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling by Exploring Energy of the Discriminator (04/05/2020)
Generative Adversarial Networks (GANs) have shown great promise in model...

Maximum Entropy Generators for Energy-Based Models (01/24/2019)
Unsupervised learning is about capturing dependencies between variables ...

Training Deep Energy-Based Models with f-Divergence Minimization (03/06/2020)
Deep energy-based models (EBMs) are very flexible in distribution parame...

The Benefits of Pairwise Discriminators for Adversarial Training (02/20/2020)
Adversarial training methods typically align distributions by solving tw...

Bounds all around: training energy-based models with bidirectional bounds (11/01/2021)
Energy-based models (EBMs) provide an elegant framework for density esti...

Generalizing Energy-based Generative ConvNets from Particle Evolution Perspective (10/31/2019)
Compared with Generative Adversarial Networks (GAN), the Energy-Based ge...

The Inductive Bias of Restricted f-GANs (09/12/2018)
Generative adversarial networks are a novel method for statistical infer...
