On the Anatomy of MCMC-based Maximum Likelihood Learning of Energy-Based Models

03/29/2019
by Erik Nijkamp, et al.

This study investigates the effects of Markov Chain Monte Carlo (MCMC) sampling in unsupervised Maximum Likelihood (ML) learning. Our attention is restricted to the family of unnormalized probability densities for which the negative log density (or energy function) is a ConvNet. In general, we find that the majority of techniques used to stabilize training in previous studies can have the opposite effect. Stable ML learning with a ConvNet potential can be achieved with only a few hyper-parameters and no regularization. With this minimal framework, we identify a variety of ML learning outcomes depending on the implementation of MCMC sampling. On one hand, we show that it is easy to train an energy-based model which can sample realistic images with short-run Langevin dynamics. ML can be effective and stable even when MCMC samples have much higher energy than true steady-state samples throughout training. Based on this insight, we introduce an ML method with noise initialization for MCMC, high-quality short-run synthesis, and the same computational budget as ML with informative MCMC initialization such as Contrastive Divergence (CD) or Persistent Contrastive Divergence (PCD). Unlike previous models, ours can obtain realistic high-diversity samples from a noise signal after training, with no auxiliary models. On the other hand, models learned with highly non-convergent MCMC do not have a valid steady-state and cannot be considered approximate unnormalized densities of the training data, because long-run MCMC samples differ greatly from the data. We show that it is much harder to train an energy-based model for which long-run and steady-state MCMC samples have a realistic appearance. To our knowledge, long-run MCMC samples of all previous models yield unrealistic images. With correct tuning of Langevin noise, we train the first models for which long-run and steady-state MCMC samples are realistic images.
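To make the training procedure concrete, here is a minimal PyTorch sketch of MCMC-based ML learning with noise-initialized short-run Langevin sampling, in the spirit described above. The ConvNet potential, image size, step size, and number of Langevin steps are illustrative assumptions rather than the paper's exact settings, and the names (`ConvNetPotential`, `langevin_sample`, `ml_update`) are hypothetical.

```python
import torch
import torch.nn as nn

class ConvNetPotential(nn.Module):
    """Energy function U(x): the negative log of the unnormalized density.
    Architecture is an illustrative assumption (32x32 RGB inputs)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, 1, 1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.Linear(256 * 8 * 8, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # one energy value per image

def langevin_sample(energy, x, n_steps=100, step_size=1e-2):
    """Short-run Langevin dynamics:
    x <- x - (eps^2 / 2) * dU/dx + eps * N(0, I)."""
    for _ in range(n_steps):
        x = x.detach().requires_grad_(True)
        grad = torch.autograd.grad(energy(x).sum(), x)[0]
        x = x - 0.5 * step_size ** 2 * grad + step_size * torch.randn_like(x)
    return x.detach()

def ml_update(energy, optimizer, data_batch):
    """One ML gradient step: the loss E_data[U] - E_model[U] lowers the
    energy of training data and raises the energy of MCMC samples."""
    # Noise initialization: chains start from uniform noise rather than
    # from data (CD) or from persistent chains (PCD).
    init = torch.rand_like(data_batch) * 2.0 - 1.0
    samples = langevin_sample(energy, init)
    loss = energy(data_batch).mean() - energy(samples).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with a stand-in batch (real training would use image data in [-1, 1]):
model = ConvNetPotential()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
batch = torch.rand(16, 3, 32, 32) * 2.0 - 1.0
ml_update(model, opt, batch)
```

Whether the learned density is a valid approximate model of the data then hinges on how far these short-run chains are from convergence; the same loop with many more steps and carefully tuned Langevin noise is what the long-run experiments probe.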


