Energy Discrepancies: A Score-Independent Loss for Energy-Based Models

07/12/2023
by Tobias Schröder, et al.

Energy-based models are a simple yet powerful class of probabilistic models, but their widespread adoption has been limited by the computational burden of training them. We propose a novel loss function called Energy Discrepancy (ED) which does not rely on the computation of scores or expensive Markov chain Monte Carlo. We show that ED approaches the explicit score matching and negative log-likelihood loss under different limits, effectively interpolating between both. Consequently, minimum ED estimation overcomes the problem of nearsightedness encountered in score-based estimation methods, while also enjoying theoretical guarantees. Through numerical experiments, we demonstrate that ED learns low-dimensional data distributions faster and more accurately than explicit score matching or contrastive divergence. For high-dimensional image data, we describe how the manifold hypothesis puts limitations on our approach and demonstrate the effectiveness of energy discrepancy by training the energy-based model as a prior of a variational decoder model.
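To make the "no scores, no MCMC" claim concrete, here is a minimal NumPy sketch of a Gaussian-perturbation energy discrepancy estimator on a toy 1-D quadratic energy family. The double-perturbation construction (contrasting the energy at a data point against energies at points perturbed twice by Gaussian noise, with a w/M stabilisation term) follows the spirit of the paper; the specific energy family, constants, and training-free demonstration below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def energy(x, theta):
    """Toy parametric energy U(x) = a * (x - b)**2 (illustrative choice)."""
    a, b = theta
    return a * (x - b) ** 2

def ed_loss(theta, x, t=0.25, M=8, w=1.0, seed=0):
    """Sketch of a Gaussian-perturbation energy-discrepancy-style loss.

    For each data point x_i, draw one perturbation xi_i and M second-stage
    perturbations xi'_{ij}, then contrast U(x_i) against the energies at
    x_i + sqrt(t) * (xi_i + xi'_{ij}). The w/M term plays the role of the
    stabilisation described in the paper; constants here are assumptions.
    """
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    xi = rng.standard_normal((n, 1))        # shared first-stage noise
    xi2 = rng.standard_normal((n, M))       # M second-stage noise draws
    neg = x[:, None] + np.sqrt(t) * (xi + xi2)
    diff = energy(x[:, None], theta) - energy(neg, theta)   # shape (n, M)
    # Numerically stable log(w/M + mean_j exp(diff_ij)) per data point
    m = diff.max(axis=1, keepdims=True)
    lse = m[:, 0] + np.log(np.exp(diff - m).sum(axis=1))    # log sum_j e^diff
    lme = np.logaddexp(np.log(w / M), lse - np.log(M))
    return lme.mean()

# The loss needs only energy evaluations: no score (gradient of U) and no
# Markov chain sampling. A well-matched energy scores lower than a poor one:
rng = np.random.default_rng(42)
x = 2.0 + 0.5 * rng.standard_normal(512)    # data ~ N(2, 0.25)
print(ed_loss((2.0, 2.0), x))               # energy matching the data
print(ed_loss((0.5, 0.0), x))               # mismatched energy, higher loss
```

Note that a completely flat energy (a = 0) yields the constant value log(1 + w/M), which makes the role of the stabiliser easy to check by hand.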


Related research

- Training Discrete Energy-Based Models with Energy Discrepancy (07/14/2023)
- Score Matching Model for Unbounded Data Score (06/10/2021)
- Cutting out the Middle-Man: Training and Evaluating Energy-Based Models without Sampling (02/13/2020)
- Sliced Score Matching: A Scalable Approach to Density and Score Estimation (05/17/2019)
- On Energy-Based Models with Overparametrized Shallow Neural Networks (04/15/2021)
- Divergence Triangle for Joint Training of Generator Model, Energy-based Model, and Inference Model (12/28/2018)
- No MCMC for me: Amortized sampling for fast and stable training of energy-based models (10/08/2020)
