
Rosenthal-type inequalities for linear statistics of Markov chains

by Alain Durmus, et al.

In this paper, we establish novel deviation bounds for additive functionals of geometrically ergodic Markov chains, akin to the Rosenthal- and Bernstein-type inequalities for sums of independent random variables. We pay special attention to the dependence of our bounds on the mixing time of the corresponding chain. Our proof technique is, to the best of our knowledge, new, and is based on the repeated application of the Poisson decomposition. We relate the constants appearing in our moment bounds to the constants from the martingale version of the Rosenthal inequality and derive an explicit dependence on the parameters of the underlying Markov kernel.
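For context, the classical inequalities that the abstract takes as a benchmark can be stated as follows (these are the standard formulations for independent variables, not results from the paper itself). For independent centered random variables $X_1,\dots,X_n$ with $S_n = \sum_{i=1}^n X_i$:

```latex
% Rosenthal's inequality: for p >= 2 and a constant C_p depending only on p,
\mathbb{E}\bigl|S_n\bigr|^p
  \;\le\; C_p \Biggl\{ \sum_{i=1}^n \mathbb{E}|X_i|^p
  \;+\; \Bigl( \sum_{i=1}^n \mathbb{E}[X_i^2] \Bigr)^{p/2} \Biggr\}.

% Bernstein's inequality: if additionally |X_i| <= M almost surely,
% with sigma^2 = sum_i E[X_i^2], then for all t >= 0,
\mathbb{P}\bigl( |S_n| \ge t \bigr)
  \;\le\; 2 \exp\!\left( - \frac{t^2}{2\bigl(\sigma^2 + M t / 3\bigr)} \right).
```

The paper's contribution is to obtain analogues of these bounds when the $X_i$ are replaced by $f(X_i)$ for a geometrically ergodic Markov chain $(X_i)$, with explicit control of how the constants degrade with the chain's mixing time.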



