# Various proofs of the Fundamental Theorem of Markov Chains

This paper is a survey of various proofs of the so-called fundamental theorem of Markov chains: every ergodic Markov chain has a unique positive stationary distribution, and the chain attains this distribution in the limit regardless of its initial distribution. As Markov chains are stochastic processes, it is natural to use probability-based arguments in proofs. At the same time, the dynamics of a Markov chain are completely captured by its initial distribution, which is a vector, and its transition probability matrix, so arguments based on matrix analysis and linear algebra can also be used. The proofs discussed below use one or the other of these two types of arguments, except in one case where the argument is graph-theoretic. Appropriate credits for the various proofs are given in the main text. Our first proof is entirely elementary and quite simple. It also suggests a mixing time bound, which we prove, although in many cases this bound will not be the best one. One approach to proving the fundamental theorem breaks the proof into two parts: (i) show the existence of a unique positive stationary distribution for irreducible Markov chains, and (ii) assuming that an ergodic chain does have a stationary distribution, show that the chain converges in the limit to that distribution irrespective of the initial distribution. For (i), we survey two proofs: one uses probability arguments and the other uses graph-theoretic arguments. For (ii), we first give a coupling-based proof (coupling is a probability-based technique); the second proof uses matrix analysis. Finally, we give a proof of the fundamental theorem using only linear algebra concepts.
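The theorem's statement can be illustrated numerically: iterating the transition matrix on any initial distribution converges to the same stationary distribution. The following is a minimal sketch, not taken from the paper; the matrix `P` and the helper names `step` and `limit_distribution` are illustrative choices.

```python
# Illustrative sketch: an ergodic (irreducible, aperiodic) 3-state chain
# converges to its unique stationary distribution from any starting point.

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# A row-stochastic transition matrix with all entries positive,
# so the chain is ergodic and the theorem applies.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]

def limit_distribution(dist, P, iterations=200):
    """Apply the chain repeatedly; the result approximates the limit."""
    for _ in range(iterations):
        dist = step(dist, P)
    return dist

# Two very different initial distributions reach the same limit,
# which is (approximately) stationary: step(pi, P) == pi.
a = limit_distribution([1.0, 0.0, 0.0], P)
b = limit_distribution([0.0, 0.0, 1.0], P)
print(all(abs(x - y) < 1e-9 for x, y in zip(a, b)))  # prints True
```

The convergence rate in such examples is governed by the second-largest eigenvalue modulus of `P`, which is what a mixing time bound controls.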
