Deep Tempering

10/01/2014
by Guillaume Desjardins, et al.

Restricted Boltzmann Machines (RBMs) are one of the fundamental building blocks of deep learning. Approximate maximum likelihood training of RBMs typically necessitates sampling from these models, yet in many training scenarios the computationally efficient Gibbs sampling procedures are crippled by poor mixing. In this work we propose a novel, computationally efficient sampling method for Boltzmann machines that promotes mixing. Our approach leverages an under-appreciated property of deep generative models such as the Deep Belief Network (DBN): Gibbs sampling from deeper levels of the latent variable hierarchy results in dramatically increased ergodicity. Our approach is thus to train an auxiliary latent hierarchical model based on the DBN. When used in conjunction with parallel tempering, the method is asymptotically guaranteed to simulate samples from the target RBM. Experimental results confirm the effectiveness of this sampling strategy in the context of RBM training.
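
As a rough illustration of the sampling mechanics described above, here is a minimal NumPy sketch assuming binary (Bernoulli-Bernoulli) RBMs and a two-layer DBN obtained by stacking an auxiliary RBM on the target RBM's hidden units. All class and function names are illustrative, not taken from the paper's code, and the parallel-tempering swap between the shallow and deep chains (the part that yields the asymptotic guarantee) is deliberately omitted. The sketch only contrasts plain block Gibbs on the target RBM with Gibbs in the deeper layer followed by a stochastic top-down pass.

```python
# Minimal sketch: Gibbs sampling in the deeper layer of a DBN vs. plain
# block Gibbs on the target RBM. Illustrative only; not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bernoulli(p):
    return (rng.random(p.shape) < p).astype(np.float64)

class RBM:
    """Bernoulli-Bernoulli RBM with block Gibbs sampling and free energy."""
    def __init__(self, n_vis, n_hid, scale=0.01):
        self.W = scale * rng.standard_normal((n_vis, n_hid))
        self.b = np.zeros(n_vis)   # visible bias
        self.c = np.zeros(n_hid)   # hidden bias

    def p_h_given_v(self, v):
        return sigmoid(v @ self.W + self.c)

    def p_v_given_h(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def gibbs_step(self, v):
        h = bernoulli(self.p_h_given_v(v))
        v = bernoulli(self.p_v_given_h(h))
        return v, h

    def free_energy(self, v):
        # F(v) = -b^T v - sum_j log(1 + exp(c_j + (v W)_j))
        return -(v @ self.b) - np.sum(np.logaddexp(0.0, v @ self.W + self.c), axis=-1)

def sample_dbn(rbm_lo, rbm_hi, h1, n_top_steps=50):
    """Sample a visible configuration from a two-layer DBN: run block Gibbs
    in the top-level RBM (whose visibles are the lower RBM's hidden units),
    then one stochastic top-down pass. Mixing happens in the deeper layer,
    which is the property the abstract highlights."""
    for _ in range(n_top_steps):
        h1, _ = rbm_hi.gibbs_step(h1)
    v = bernoulli(rbm_lo.p_v_given_h(h1))
    return v, h1

if __name__ == "__main__":
    n_vis, n_hid1, n_hid2 = 784, 500, 500
    rbm1 = RBM(n_vis, n_hid1)      # target (lower) RBM
    rbm2 = RBM(n_hid1, n_hid2)     # auxiliary top-level RBM of the DBN

    # Shallow chain: plain block Gibbs on the target RBM.
    v_shallow = bernoulli(np.full((1, n_vis), 0.5))
    for _ in range(100):
        v_shallow, _ = rbm1.gibbs_step(v_shallow)

    # Deep chain: Gibbs in the top layer, then a top-down pass.
    h1 = bernoulli(np.full((1, n_hid1), 0.5))
    v_deep, h1 = sample_dbn(rbm1, rbm2, h1)

    print("free energy (shallow chain):", rbm1.free_energy(v_shallow))
    print("free energy (deep chain):   ", rbm1.free_energy(v_deep))
```

In the full method, the deep chain would additionally propose swaps of its state with the shallow chain, accepted or rejected with a Metropolis-Hastings test in the spirit of parallel tempering; that exchange step is what makes the combined sampler asymptotically exact for the target RBM, and it is not reproduced in this sketch.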


Related research

10/19/2022 · Gaussian-Bernoulli RBMs Without Tears
We revisit the challenging problem of training Gaussian-Bernoulli restri...

07/19/2018 · Approximate Collapsed Gibbs Clustering with Expectation Propagation
We develop a framework for approximating collapsed Gibbs sampling in gen...

10/21/2015 · Application of Quantum Annealing to Training of Deep Neural Networks
In Deep Learning, a well-known approach for training a Deep Neural Netwo...

05/11/2023 · Investigating the generative dynamics of energy-based neural networks
Generative neural networks can produce data samples according to the sta...

02/10/2017 · A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines
Restricted Boltzmann machines (RBMs) are energy-based neural-networks wh...

11/16/2018 · Deep Knockoffs
This paper introduces a machine for sampling approximate model-X knockof...

05/31/2017 · Propositional Knowledge Representation in Restricted Boltzmann Machines
Representing symbolic knowledge into a connectionist network is the key ...
