Truncated Variational Sampling for "Black Box" Optimization of Generative Models
We investigate the optimization of two generative models with binary hidden variables using a novel variational EM approach. The approach distinguishes itself from previous variational methods by using hidden states as variational parameters. Here we use efficient, general-purpose sampling procedures to vary the hidden states, and we investigate the "black box" applicability of the resulting optimization procedure. For general-purpose applicability, samples are drawn from approximate marginal distributions and from the prior distribution of the considered generative model, so that sampling takes a generic form requiring no additional derivations. As a proof of concept, we apply the procedure (A) to Binary Sparse Coding (a model with continuous observables) and (B) to basic Sigmoid Belief Networks (models with binary observables). The approach is applicable without further analytical steps and increases the variational free-energy objective both efficiently and effectively.
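To make the described E-step concrete, the following is a minimal NumPy sketch of a sampling-based truncated variational update, assuming a Binary Sparse Coding model with a Bernoulli(pi) prior and Gaussian observation noise; candidate hidden states are proposed from the prior and from approximate posterior marginals, and each data point keeps the states with the highest joint probability so the truncated free energy cannot decrease. Function names such as `log_joint_bsc` and `tvs_update_states` are illustrative and not taken from the paper's code.

```python
import numpy as np
from scipy.special import logsumexp


def log_joint_bsc(Y, S, W, sigma2, pi):
    """log p(y_n, s | Theta) for Binary Sparse Coding.
    Y: (N, D) observations; S: (N, K, H) candidate binary states per data point."""
    log_prior = (S * np.log(pi) + (1 - S) * np.log(1 - pi)).sum(-1)     # (N, K)
    resid = Y[:, None, :] - S @ W.T                                     # (N, K, D)
    D = Y.shape[1]
    log_lik = -0.5 * (resid ** 2).sum(-1) / sigma2 \
              - 0.5 * D * np.log(2 * np.pi * sigma2)                    # (N, K)
    return log_prior + log_lik


def tvs_update_states(Y, K_states, W, sigma2, pi, n_new=10, rng=None):
    """One sampling-based variational E-step: propose states from the prior and
    from approximate marginals of the current truncated posteriors, then keep
    the best states per data point by joint probability."""
    rng = np.random.default_rng() if rng is None else rng
    N, K, H = K_states.shape

    # Approximate posterior marginals q_n(s_h = 1) from the current truncated sets.
    lj = log_joint_bsc(Y, K_states, W, sigma2, pi)                      # (N, K)
    w = np.exp(lj - logsumexp(lj, axis=1, keepdims=True))               # (N, K)
    marg = np.einsum('nk,nkh->nh', w, K_states)                         # (N, H)

    # Half of the proposals from the prior, half from the marginals.
    prior_prop = (rng.random((N, n_new // 2, H)) < pi).astype(float)
    marg_prop = (rng.random((N, n_new - n_new // 2, H))
                 < marg[:, None, :]).astype(float)

    # Merge old and proposed states and keep the K best per data point.
    # (A fuller implementation would also discard duplicate states.)
    cand = np.concatenate([K_states, prior_prop, marg_prop], axis=1)    # (N, K', H)
    lj_cand = log_joint_bsc(Y, cand, W, sigma2, pi)
    top = np.argsort(-lj_cand, axis=1)[:, :K]                           # (N, K)
    new_states = np.take_along_axis(cand, top[:, :, None], axis=1)
    # Truncated free energy: F = sum_n log sum_{s in K_n} p(y_n, s | Theta).
    free_energy = logsumexp(np.take_along_axis(lj_cand, top, axis=1), axis=1).sum()
    return new_states, free_energy
```

Alternating this update with a standard M-step for W, sigma2, and pi then monotonically increases the truncated free energy under the stated assumptions.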