
I am under the impression that an irreducible, finite Markov chain is necessarily positive recurrent. How might I show this?

Regards, Jon

JW1986

1 Answer


You are right. Which argument is most useful for you depends on your definitions and on what you have already learned about Markov chains.

Here is one way to look at it. If $x$ is a null state, then the chain spends very little time in $x$; more precisely, $${1\over n}\sum_{j=1}^n 1_{[X_j=x]}\to 0 \text{ almost surely.} $$ Therefore, for any finite set $F$ of null states we also have $${1\over n}\sum_{j=1}^n 1_{[X_j\in F]}\to 0 \text{ almost surely.} $$

But the chain must be spending its time somewhere, so if the state space itself is finite, there must be a positive state. A positive state is necessarily recurrent, and since positive recurrence is a class property, if the chain is irreducible then all states are positive recurrent.
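As a numerical sanity check (not part of the proof), here is a small Python sketch. The 3-state transition matrix `P` is a made-up illustrative example, not from the question. By Kac's formula, for a positive recurrent chain the expected return time to a state $x$ equals $1/\pi(x)$, where $\pi$ is the stationary distribution; simulating returns to a fixed state should therefore give a finite, stable estimate.

```python
import random

# Hypothetical irreducible 3-state chain (illustrative assumption,
# not taken from the question). Every row sums to 1.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]

def step(state, rng):
    # Sample the next state from row `state` of the transition matrix.
    return rng.choices(range(len(P)), weights=P[state])[0]

def mean_return_time(x, n_returns=20000, seed=0):
    # Estimate E_x[return time to x] by averaging over many excursions.
    rng = random.Random(seed)
    state, total, returns, t = x, 0, 0, 0
    while returns < n_returns:
        state = step(state, rng)
        t += 1
        if state == x:
            total += t     # length of this excursion away from x
            returns += 1
            t = 0
    return total / returns

# For this P the stationary distribution is pi = (9/28, 12/28, 7/28),
# so the estimate for state 0 should be close to 1/pi(0) = 28/9 ~ 3.11.
print(mean_return_time(0))
```

Solving $\pi P = \pi$ by hand for this matrix gives $\pi = (9/28,\,12/28,\,7/28)$, so the finite mean return time the simulation reports is exactly what positive recurrence of every state in a finite irreducible chain predicts.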

  • 1
Hi Byron, thanks for the answer. To address your point: I am working with the definition that a state is positive recurrent if its expected return time is finite (implicitly: conditional on starting in that state). – JW1986 Feb 08 '13 at 13:14
  • 4
    To turn the definition of null recurrent into the statement above that the fraction of visits to $x$ tends to zero almost surely, one needs to use the ergodic theorem for Markov chains. – gj255 Sep 09 '17 at 09:48