On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case

05/30/2019
by Ngoc Huy Chau, et al.

We consider the problem of sampling from a target distribution that is not necessarily logconcave. We establish non-asymptotic convergence results for the Stochastic Gradient Langevin Dynamics (SGLD) algorithm in a suitable Wasserstein-type distance, even when the gradient is estimated from dependent data streams. In contrast to previous studies, our estimates are sharper and uniform in the number of iterations.
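The algorithm in question is the constant-step SGLD recursion theta_{k+1} = theta_k - lambda * g(theta_k, X_k) + sqrt(2 * lambda / beta) * xi_{k+1}, where g is a stochastic gradient of the potential evaluated at the k-th (possibly dependent) data point X_k and xi_{k+1} is standard Gaussian noise. The paper itself provides no code; the following is a minimal sketch under those assumptions, in which grad_estimate, data_stream, and all parameter defaults are illustrative placeholders rather than the authors' implementation.

```python
import numpy as np

def sgld(grad_estimate, theta0, data_stream, step_size=1e-3,
         inv_temp=1.0, n_iters=10_000, rng=None):
    """Sketch of constant-step-size SGLD:
        theta_{k+1} = theta_k - step_size * g_k
                      + sqrt(2 * step_size / inv_temp) * xi_{k+1}.
    grad_estimate(theta, x) should return a stochastic gradient of the
    potential at theta; data_stream may yield dependent (e.g. Markovian)
    samples, which is the setting the paper analyzes.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    noise_scale = np.sqrt(2.0 * step_size / inv_temp)
    for _ in range(n_iters):
        x = next(data_stream)            # k-th data point, possibly dependent
        g = grad_estimate(theta, x)      # stochastic gradient at current theta
        theta = (theta - step_size * g
                 + noise_scale * rng.standard_normal(theta.shape))
    return theta  # approximate sample from the (possibly non-logconcave) target
```

Here step_size plays the role of lambda and inv_temp of beta in the usual SGLD notation; the abstract's claim is that the Wasserstein-type distance between the law of theta after n_iters steps and the target can be bounded uniformly in n_iters, even when data_stream is dependent.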


research · 12/06/2018
On stochastic gradient Langevin dynamics with dependent data streams in the logconcave case
Stochastic Gradient Langevin Dynamics (SGLD) is a combination of a Robbi...

research · 10/04/2019
Nonasymptotic estimates for Stochastic Gradient Langevin Dynamics under local conditions in nonconvex optimization
Within the context of empirical risk minimization, see Raginsky, Rakhlin...

research · 05/04/2021
On the stability of the stochastic gradient Langevin algorithm with dependent data stream
We prove, under mild conditions, that the stochastic gradient Langevin d...

research · 04/18/2023
Finite-Sample Bounds for Adaptive Inverse Reinforcement Learning using Passive Langevin Dynamics
Stochastic gradient Langevin dynamics (SGLD) are a useful methodology fo...

research · 02/05/2019
Distribution-Dependent Analysis of Gibbs-ERM Principle
Gibbs-ERM learning is a natural idealized model of learning with stochas...

research · 10/25/2021
On quantitative Laplace-type convergence results for some exponential probability measures, with two applications
Laplace-type results characterize the limit of sequence of measures (π_ε...

research · 10/02/2020
Accelerating Convergence of Replica Exchange Stochastic Gradient MCMC via Variance Reduction
Replica exchange stochastic gradient Langevin dynamics (reSGLD) has show...
