Sunday, 18 September 2016

Reducible state space Markov chains

While reviewing a paper last night, I started thinking about reducible state space Markov chains.  Most undergraduate probability students are fed theorems about irreducible Markov chains.  To summarize these theorems:

  • If two states communicate, then they are either both recurrent or both transient.
  • If a MC is irreducible and all states are positive recurrent, then there exists a unique stationary distribution.
  • If a MC is irreducible, aperiodic, and has a stationary distribution, then it converges to that stationary distribution.

We can ask a set of questions about reducible Markov chains to test our intuition for the subject.
  • Let's set up a trivial 4-state Markov chain where the first two and last two states form separate communicating classes.
  • How many stationary distributions exist?
  • Do limiting distributions exist?
  • Does the limiting distribution depend on the initial distribution?  How many possibilities are there?

Let's try to answer them.
  • There are infinitely many stationary distributions.  Take the stationary distribution of the first class and the stationary distribution of the second class; any convex combination of the two (nonnegative weights summing to 1) is a stationary distribution of the full 4-state Markov chain.
  • Limiting distributions do exist (assuming each class is aperiodic).  Let's think about the intuition here.  If the total probability mass on the first two states is m_0 and the rest is 1-m_0, then starting from this initial distribution, each iteration of the chain "stirs" this vector, but only within the respective classes.  So the first two entries of the vector converge to the first class's limiting distribution, and likewise for the last two.  This means the overall limiting distribution is a mixture of the individual limiting distributions weighted by the (m_0, 1-m_0) vector.  In particular, the limit depends on the initial distribution only through m_0, so there is one possible limiting distribution for each value of m_0 in [0, 1] — infinitely many in total.
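Both answers can be checked numerically.  Here is a minimal sketch with NumPy; the block-diagonal transition matrix P below is made up for illustration, with each 2x2 block aperiodic:

```python
import numpy as np

# Hypothetical 4-state reducible chain: states {0,1} and {2,3} form
# separate communicating classes, so P is block-diagonal.
P = np.array([
    [0.7, 0.3, 0.0, 0.0],
    [0.4, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.5, 0.5],
])

def stationary(Q):
    """Stationary distribution of Q: the left eigenvector for
    eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(Q.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return v / v.sum()

pi1 = stationary(P[:2, :2])   # stationary dist. of the first class
pi2 = stationary(P[2:, 2:])   # stationary dist. of the second class

# Any convex combination is stationary for the full chain.
m0 = 0.3                      # arbitrary mixture weight in [0, 1]
pi = np.concatenate([m0 * pi1, (1 - m0) * pi2])
assert np.allclose(pi @ P, pi)

# The limit depends on the initial distribution only through the mass
# m_0 placed on the first class.
mu = np.array([0.1, 0.2, 0.3, 0.4])       # initial dist.; m_0 = 0.3
limit = mu @ np.linalg.matrix_power(P, 200)
assert np.allclose(limit, pi)
```

Note that the two assertions pass for any choice of m0 and any initial distribution mu putting mass m0 on the first two states, which is exactly the "one limit per m_0" picture above.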