Geometry and Expressive Power of Conditional Restricted Boltzmann Machines

by Guido Montufar, et al.

Conditional restricted Boltzmann machines are undirected stochastic neural networks with a layer of input and output units connected bipartitely to a layer of hidden units. These networks define models of conditional probability distributions on the states of the output units given the states of the input units, parametrized by interaction weights and biases. We address the representational power of these models, proving results on their ability to represent conditional Markov random fields and conditional distributions with restricted supports, on the minimal size of universal approximators, on the maximal model approximation errors, and on the dimension of the set of representable conditional distributions. We contribute new tools for investigating conditional probability models, which allow us to improve the results that can be derived from existing work on restricted Boltzmann machine probability models.
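The model described above can be illustrated with a small worked sketch. The following is not the paper's code but a minimal, brute-force illustration under standard CRBM conventions: binary input, output, and hidden units, with the conditional distribution p(y | x) obtained by summing Boltzmann factors over the hidden layer and normalizing over output states. All sizes and parameter values here are hypothetical.

```python
import itertools
import math
import random

# Hypothetical tiny CRBM: n input, m output, k hidden binary units.
random.seed(0)
n, m, k = 3, 2, 2
W = [[random.gauss(0, 1) for _ in range(k)] for _ in range(n)]  # input-hidden interaction weights
V = [[random.gauss(0, 1) for _ in range(k)] for _ in range(m)]  # output-hidden interaction weights
b = [random.gauss(0, 1) for _ in range(m)]                      # output biases
c = [random.gauss(0, 1) for _ in range(k)]                      # hidden biases

def energy(x, y, h):
    """Energy of the joint state (y, h) given the input x (standard CRBM form)."""
    s = sum(b[j] * y[j] for j in range(m)) + sum(c[l] * h[l] for l in range(k))
    s += sum(x[i] * W[i][l] * h[l] for i in range(n) for l in range(k))
    s += sum(y[j] * V[j][l] * h[l] for j in range(m) for l in range(k))
    return -s

def conditional(x):
    """p(y | x): marginalize exp(-energy) over hidden states, normalize over outputs."""
    outputs = list(itertools.product([0, 1], repeat=m))
    hidden = list(itertools.product([0, 1], repeat=k))
    unnorm = [sum(math.exp(-energy(x, y, h)) for h in hidden) for y in outputs]
    Z = sum(unnorm)  # input-dependent normalization constant
    return {y: u / Z for y, u in zip(outputs, unnorm)}

p = conditional((1, 0, 1))  # a distribution over the 2**m output states
```

Enumeration is exponential in the number of output and hidden units, so this only conveys the definition; it makes explicit that each input x selects one conditional distribution from the parametrized family studied in the paper.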


