Mini-batch Tempered MCMC

07/31/2017
by Dangna Li, et al.

In this paper we propose a general framework for performing MCMC with only a mini-batch of data. We show that by estimating the Metropolis-Hastings ratio with only a mini-batch of data, one is essentially sampling from the true posterior raised to a known temperature. Our experiments show that the resulting method, Mini-batch Tempered MCMC (MINT-MCMC), can efficiently explore multiple modes of a posterior distribution. As an application, we demonstrate MINT-MCMC as an inference tool for Bayesian neural networks. We also show that a cyclic version of our algorithm can be used to build an ensemble of neural networks at little additional training cost.
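To make the tempering intuition concrete, below is a minimal, illustrative Python sketch of a Metropolis-Hastings step whose log-likelihood term is estimated from a mini-batch. The random-walk proposal, the log_prior/log_lik interfaces, and the batch handling are assumptions made for illustration only; the paper's actual acceptance-ratio estimator and its temperature derivation differ in detail.

```python
# Illustrative sketch only: a naive mini-batch Metropolis-Hastings step.
# Because the mini-batch log-likelihood sum is used without rescaling to the
# full data set, the chain informally targets a tempered distribution rather
# than the exact posterior, which is the effect MINT-MCMC analyzes.
import numpy as np


def minibatch_mh_step(theta, data, log_prior, log_lik, batch_size,
                      step=0.1, rng=None):
    """One random-walk MH step with a mini-batch log-likelihood estimate.

    theta: current parameter vector (np.ndarray)
    data: full data set, indexable along axis 0
    log_lik(theta, batch): per-example log-likelihoods for `batch`
    """
    rng = np.random.default_rng() if rng is None else rng

    # Draw a mini-batch uniformly without replacement.
    idx = rng.choice(len(data), size=batch_size, replace=False)
    batch = data[idx]

    # Symmetric Gaussian random-walk proposal.
    proposal = theta + step * rng.standard_normal(theta.shape)

    # Mini-batch estimate of the log acceptance ratio: prior ratio plus the
    # *unscaled* mini-batch log-likelihood ratio (hence the tempering effect).
    log_ratio = (log_prior(proposal) - log_prior(theta)
                 + log_lik(proposal, batch).sum()
                 - log_lik(theta, batch).sum())

    # Standard Metropolis accept/reject step.
    if np.log(rng.uniform()) < log_ratio:
        return proposal
    return theta
```

Informally, the mini-batch estimate contributes only a batch-sized fraction of the full log-likelihood, so the likelihood enters the target at a reduced inverse temperature; the paper formalizes this as sampling from the posterior raised to a known temperature.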

Related research

07/22/2019
Properties of the Stochastic Approximation EM Algorithm with Mini-batch Sampling
To speed up convergence a mini-batch version of the Monte Carlo Markov C...

10/17/2022
Data Subsampling for Bayesian Neural Networks
Markov Chain Monte Carlo (MCMC) algorithms do not scale well for large d...

08/08/2019
Mini-batch Metropolis-Hastings MCMC with Reversible SGLD Proposal
Traditional MCMC algorithms are computationally intensive and do not sca...

06/18/2021
An Investigation into Mini-Batch Rule Learning
We investigate whether it is possible to learn rule sets efficiently in ...

05/29/2019
Replica-exchange Nosé-Hoover dynamics for Bayesian learning on large datasets
In this paper, we propose a new sampler for Bayesian learning that can e...

05/10/2023
Phase transitions in the mini-batch size for sparse and dense neural networks
The use of mini-batches of data in training artificial neural networks i...

04/02/2023
Mini-batch k-means terminates within O(d/ε) iterations
We answer the question: "Does local progress (on batches) imply global p...
