Spread Divergences

by David Barber et al.

For distributions p and q with different support, the divergence generally will not exist. We define a spread divergence on modified p and q and describe sufficient conditions for the existence of such a divergence. We give examples of using a spread divergence to train implicit generative models, including linear models (Principal Components Analysis and Independent Components Analysis) and non-linear models (Deep Generative Networks).
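The core idea above can be illustrated with a minimal numerical sketch (my own hedged example, not code from the paper): take p and q to be point masses at different locations, so KL(p||q) is undefined, then "spread" both by convolving with the same Gaussian noise kernel. The spread distributions have full shared support, so their KL divergence exists and is finite.

```python
import numpy as np

def spread(samples, sigma, grid):
    # Density of the empirical distribution of `samples` convolved with
    # Gaussian noise N(0, sigma^2), evaluated on `grid`.
    diffs = grid[None, :] - samples[:, None]
    dens = np.exp(-0.5 * (diffs / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return dens.mean(axis=0)

grid = np.linspace(-5.0, 5.0, 2001)
dx = grid[1] - grid[0]

p_samples = np.array([0.0])  # delta at 0
q_samples = np.array([1.0])  # delta at 1: disjoint support, KL(p||q) undefined

# Spread both distributions with the SAME noise kernel.
p_tilde = spread(p_samples, sigma=0.5, grid=grid)
q_tilde = spread(q_samples, sigma=0.5, grid=grid)

# Numerical KL between the spread distributions (Riemann sum).
kl = np.sum(p_tilde * np.log(p_tilde / q_tilde)) * dx
print(kl)
```

For two deltas spread by N(0, sigma^2), the spread KL is the KL between two Gaussians with equal variance, (mu_p - mu_q)^2 / (2 sigma^2) = 2.0 here, so the divergence that was undefined on the original p and q becomes finite and informative after spreading.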




